Apr 19 12:09:51.116451 ip-10-0-131-150 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 12:09:51.616845 ip-10-0-131-150 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:51.616845 ip-10-0-131-150 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 12:09:51.616845 ip-10-0-131-150 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:51.616845 ip-10-0-131-150 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 19 12:09:51.616845 ip-10-0-131-150 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:51.618682 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.618588 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 12:09:51.625078 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625060 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:51.625078 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625077 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625081 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625085 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625088 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625092 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625095 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625098 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625101 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625104 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625119 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625122 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625125 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625128 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625131 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625134 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625138 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625142 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625144 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625147 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:51.625171 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625157 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625161 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625164 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625167 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625169 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625172 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625175 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625178 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625181 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625183 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625186 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625189 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625191 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625194 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625196 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625199 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625201 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625205 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625207 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:51.625629 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625210 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625212 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625215 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625218 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625222 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625224 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625226 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625229 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625232 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625234 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625236 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625239 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625241 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625244 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625246 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625251 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625254 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625256 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625259 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625261 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:51.626196 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625264 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625266 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625269 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625271 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625274 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625276 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625279 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625283 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625286 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625290 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625292 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625297 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625300 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625302 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625305 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625308 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625310 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625313 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625315 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625318 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:51.626693 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625320 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625323 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625325 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625328 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625330 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625333 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625336 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625750 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625756 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625759 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625762 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625765 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625767 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625770 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625772 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625776 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625778 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625781 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625783 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625786 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:51.627192 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625788 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625792 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625796 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625799 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625802 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625804 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625807 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625809 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625812 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625815 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625817 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625820 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625823 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625825 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625828 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625830 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625833 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625835 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625838 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625841 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:51.627684 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625844 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625847 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625849 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625852 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625854 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625857 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625859 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625862 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625864 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625867 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625870 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625872 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625874 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625877 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625880 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625882 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625885 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625888 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625890 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625893 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:51.628193 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625895 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625898 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625900 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625903 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625905 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625908 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625910 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625913 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625916 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625918 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625922 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625925 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625927 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625930 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625933 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625935 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625938 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625940 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625943 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625945 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:51.628675 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625948 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625950 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625954 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625958 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625961 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625964 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625968 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625970 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625973 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625977 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625979 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625982 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.625984 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626054 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626066 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626073 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626077 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626082 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626085 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626090 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 12:09:51.629202 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626094 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626098 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626102 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626106 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626122 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626125 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626129 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626132 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626135 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626138 2567 flags.go:64] FLAG: --cloud-config=""
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626141 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626144 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626149 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626152 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626155 2567 flags.go:64] FLAG: --config-dir=""
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626158 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626162 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626166 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626169 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626172 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626176 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626179 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626182 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626185 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626188 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 12:09:51.629726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626191 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626195 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626198 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626202 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626205 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626208 2567 flags.go:64] FLAG: --enable-server="true"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626211 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626215 2567 flags.go:64] FLAG: --event-burst="100"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626218 2567 flags.go:64] FLAG: --event-qps="50"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626222 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626225 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626228 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626232 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626235 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626238 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626241 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626244 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626247 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626250 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626253 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626256 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626259 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626262 2567 flags.go:64] FLAG: --feature-gates=""
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626266 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626269 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 19 12:09:51.630371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626272 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626277 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626280 2567
flags.go:64] FLAG: --healthz-port="10248" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626283 2567 flags.go:64] FLAG: --help="false" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626286 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-131-150.ec2.internal" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626290 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626293 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626296 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626300 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626303 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626306 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626309 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626312 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626315 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626318 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626321 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 12:09:51.631000 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626324 2567 flags.go:64] FLAG: --kube-reserved="" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626327 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626330 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626333 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626336 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626339 2567 flags.go:64] FLAG: --lock-file="" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626342 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626345 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 12:09:51.631000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626348 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626353 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626356 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626359 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626362 2567 flags.go:64] FLAG: --logging-format="text" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626365 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626368 2567 flags.go:64] 
FLAG: --make-iptables-util-chains="true" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626371 2567 flags.go:64] FLAG: --manifest-url="" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626374 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626382 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626385 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626389 2567 flags.go:64] FLAG: --max-pods="110" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626392 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626395 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626399 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626402 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626405 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626407 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626410 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626418 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626421 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 
12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626424 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626427 2567 flags.go:64] FLAG: --pod-cidr="" Apr 19 12:09:51.631611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626430 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626436 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626443 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626447 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626450 2567 flags.go:64] FLAG: --port="10250" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626453 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626456 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0284ee6228b2860a6" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626459 2567 flags.go:64] FLAG: --qos-reserved="" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626462 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626465 2567 flags.go:64] FLAG: --register-node="true" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626468 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626471 2567 flags.go:64] FLAG: --register-with-taints="" Apr 19 12:09:51.632232 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:09:51.626475 2567 flags.go:64] FLAG: --registry-burst="10" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626478 2567 flags.go:64] FLAG: --registry-qps="5" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626481 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626484 2567 flags.go:64] FLAG: --reserved-memory="" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626488 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626491 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626494 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626497 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626500 2567 flags.go:64] FLAG: --runonce="false" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626503 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626506 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626509 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626512 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626515 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 19 12:09:51.632232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626518 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 19 
12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626521 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626525 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626527 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626530 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626533 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626536 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626539 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626544 2567 flags.go:64] FLAG: --system-cgroups="" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626547 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626553 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626555 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626559 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626564 2567 flags.go:64] FLAG: --tls-min-version="" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626566 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626569 2567 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626574 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626577 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626580 2567 flags.go:64] FLAG: --v="2" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626584 2567 flags.go:64] FLAG: --version="false" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626588 2567 flags.go:64] FLAG: --vmodule="" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626593 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626596 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626711 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 12:09:51.632860 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626715 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626720 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626724 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626726 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626729 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626731 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626734 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626737 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626740 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626743 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626745 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626748 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626750 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626753 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626756 2567 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626758 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626763 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626766 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626768 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:09:51.633460 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626772 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626776 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626779 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626782 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626784 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626788 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626791 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626793 2567 feature_gate.go:328] unrecognized feature gate: 
UpgradeStatus Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626796 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626799 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626801 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626804 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626806 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626809 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626811 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626814 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626817 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626819 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626822 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626824 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:09:51.633985 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626827 2567 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626830 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626833 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626835 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626838 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626841 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626843 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626846 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626848 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626852 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626855 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626857 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626860 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 12:09:51.634562 
ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626862 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626865 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626867 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626870 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626874 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626877 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:09:51.634562 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626880 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626882 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626885 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626887 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626890 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626892 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626895 
2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626897 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626900 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626902 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626905 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626907 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626910 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626912 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626915 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626918 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626920 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626923 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626926 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:09:51.635031 
ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626928 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:51.635031 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626931 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626933 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626937 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626940 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626943 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626945 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.626947 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.626952 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.633500 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.633517 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633569 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633574 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633577 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633581 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633584 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:51.635537 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633587 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633589 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633592 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633595 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633597 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633600 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633602 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633605 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633608 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633611 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633614 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633616 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633620 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633622 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633625 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633628 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633630 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633634 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633636 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633639 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:51.635909 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633642 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633644 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633647 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633650 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633654 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633658 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633663 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633666 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633669 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633671 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633676 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633679 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633682 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633685 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633687 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633690 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633692 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633695 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633698 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:51.636409 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633700 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633702 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633705 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633708 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633710 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633713 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633715 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633718 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633721 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633723 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633726 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633729 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633731 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633734 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633736 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633739 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633741 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633744 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633746 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:51.636874 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633750 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633753 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633755 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633758 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633760 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633763 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633765 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633768 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633771 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633773 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633775 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633778 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633781 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633783 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633786 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633789 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633791 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633794 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633797 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633799 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633802 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:51.637478 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633804 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633807 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.633812 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633907 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633912 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633915 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633918 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633920 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633923 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633926 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633929 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633932 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633936 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633939 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633941 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:51.637960 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633944 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633947 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633949 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633952 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633955 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633957 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633960 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633962 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633965 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633968 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633971 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633973 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633976 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633979 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633982 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633984 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633987 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633990 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633993 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633995 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:51.638426 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.633998 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634001 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634003 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634006 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634009 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634011 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634014 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634016 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634019 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634022 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634024 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634027 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634029 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634032 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634035 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634037 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634039 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634042 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634046 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:51.638923 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634049 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634053 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634056 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634058 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634061 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634063 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634066 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634069 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634071 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634074 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634076 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634079 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634081 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634084 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634086 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634089 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634091 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634094 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634096 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634099 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:51.639404 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634103 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634106 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634125 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634128 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634131 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634134 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634136 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634139 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634141 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634144 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634147 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634149 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634152 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634154 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:51.634157 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:51.639892 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.634162 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:09:51.640278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.634944 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 19 12:09:51.640278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.637166 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 19 12:09:51.640278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.638381 2567 server.go:1019] "Starting client certificate rotation"
Apr 19 12:09:51.640278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.638477 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:09:51.640278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.639431 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:09:51.663786 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.663748 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:09:51.668639 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.668614 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:09:51.685436 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.685412 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 19 12:09:51.691382 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.691362 2567 log.go:25] "Validated CRI v1 image API"
Apr 19 12:09:51.692576 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.692557 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 19 12:09:51.695882 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.695865 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 12:09:51.697755 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.697734 2567 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b2a8c269-4321-4ab3-bfc6-4437051e258b:/dev/nvme0n1p4 ea5c0b47-86ec-40bb-bbe7-6e1036f19d5e:/dev/nvme0n1p3]
Apr 19 12:09:51.697807 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.697754 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 19 12:09:51.703791 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.703667 2567 manager.go:217] Machine: {Timestamp:2026-04-19 12:09:51.701560366 +0000 UTC m=+0.455752189 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3108400 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24646a48663dbac0efaae6f5c4291a SystemUUID:ec24646a-4866-3dba-c0ef-aae6f5c4291a BootID:7d178f01-5975-41cb-9aa0-89ba9ba96f09 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e0:9b:3e:4a:69 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e0:9b:3e:4a:69 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:b8:88:d0:5a:f2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 19 12:09:51.703791 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.703787 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 19 12:09:51.703904 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.703880 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 19 12:09:51.705020 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.704990 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 19 12:09:51.705187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.705022 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-150.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 19 12:09:51.705236 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.705198 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 19 12:09:51.705236 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.705207 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 19 12:09:51.705236 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.705220 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 19 12:09:51.705315 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.705242 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 19 12:09:51.706235 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.706225 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 19 12:09:51.706344 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.706336 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 19 12:09:51.708972 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.708962 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 19 12:09:51.709027 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.708977 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 19 12:09:51.709027 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.708990 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 19 12:09:51.709027 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.708999 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 19 12:09:51.709027 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.709018 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 19 12:09:51.710265 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.710253 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 19 12:09:51.710307 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.710274 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 19 12:09:51.713097 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.713082 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-62ck6"
Apr 19 12:09:51.713372 ip-10-0-131-150
kubenswrapper[2567]: I0419 12:09:51.713351 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 19 12:09:51.714896 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.714882 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 19 12:09:51.717392 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717373 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 19 12:09:51.717463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717398 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 19 12:09:51.717463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717408 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 19 12:09:51.717463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717418 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 19 12:09:51.717463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717428 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 19 12:09:51.717463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717444 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 19 12:09:51.717463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717454 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 19 12:09:51.717463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717462 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 19 12:09:51.717657 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717474 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 19 12:09:51.717657 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717483 2567 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 19 12:09:51.717657 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717497 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 19 12:09:51.717657 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.717510 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 19 12:09:51.718647 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.718627 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 19 12:09:51.718729 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.718652 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 19 12:09:51.719833 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.719812 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-62ck6" Apr 19 12:09:51.721127 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.721086 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-150.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 19 12:09:51.721220 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.721130 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 19 12:09:51.722917 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.722902 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 19 12:09:51.722972 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.722950 2567 server.go:1295] "Started kubelet" Apr 19 12:09:51.723077 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:09:51.723046 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 19 12:09:51.723182 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.723133 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 19 12:09:51.723239 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.723206 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 19 12:09:51.723864 ip-10-0-131-150 systemd[1]: Started Kubernetes Kubelet. Apr 19 12:09:51.725197 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.725178 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 19 12:09:51.725270 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.725231 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 19 12:09:51.730295 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.730276 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 19 12:09:51.730945 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.730873 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 19 12:09:51.731893 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.731873 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 19 12:09:51.731893 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.731884 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 19 12:09:51.732025 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.731902 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 19 12:09:51.732025 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.732018 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 19 12:09:51.732099 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.732040 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 19 12:09:51.732317 ip-10-0-131-150 
kubenswrapper[2567]: E0419 12:09:51.732297 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found" Apr 19 12:09:51.733361 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.733302 2567 factory.go:55] Registering systemd factory Apr 19 12:09:51.733361 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.733329 2567 factory.go:223] Registration of the systemd container factory successfully Apr 19 12:09:51.733599 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.733583 2567 factory.go:153] Registering CRI-O factory Apr 19 12:09:51.733599 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.733601 2567 factory.go:223] Registration of the crio container factory successfully Apr 19 12:09:51.733745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.733650 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 19 12:09:51.733745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.733675 2567 factory.go:103] Registering Raw factory Apr 19 12:09:51.733745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.733693 2567 manager.go:1196] Started watching for new ooms in manager Apr 19 12:09:51.734370 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.734343 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:09:51.734544 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.734349 2567 manager.go:319] Starting recovery of all containers Apr 19 12:09:51.734786 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.734766 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-150.ec2.internal" not found Apr 19 12:09:51.735867 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.735828 2567 kubelet.go:1618] "Image garbage 
collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 19 12:09:51.737080 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.737054 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-150.ec2.internal\" not found" node="ip-10-0-131-150.ec2.internal" Apr 19 12:09:51.744372 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.744212 2567 manager.go:324] Recovery completed Apr 19 12:09:51.750194 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.750179 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:51.750377 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.750363 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-150.ec2.internal" not found Apr 19 12:09:51.752687 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.752673 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:51.752759 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.752705 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:51.752759 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.752722 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:51.753305 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.753290 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 19 12:09:51.753305 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.753304 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 19 12:09:51.753435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.753348 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 19 12:09:51.755954 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.755941 2567 policy_none.go:49] "None policy: Start" Apr 19 12:09:51.756037 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.755960 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 19 12:09:51.756037 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.755974 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.796702 2567 manager.go:341] "Starting Device Plugin manager" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.796733 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.796742 2567 server.go:85] "Starting device plugin registration server" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.797047 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.797058 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.797144 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.797221 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.797230 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.797816 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 19 12:09:51.805543 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.797865 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-150.ec2.internal\" not found" Apr 19 12:09:51.812876 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.812860 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-150.ec2.internal" not found Apr 19 12:09:51.831371 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.831333 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 19 12:09:51.832558 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.832542 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 19 12:09:51.832644 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.832576 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 19 12:09:51.832644 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.832592 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 19 12:09:51.832644 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.832598 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 19 12:09:51.832644 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.832635 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 19 12:09:51.835552 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.835535 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:09:51.897946 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.897854 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:51.898932 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.898913 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:51.899033 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.898948 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:51.899033 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.898959 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:51.899033 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.898983 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-150.ec2.internal" Apr 19 12:09:51.907807 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.907789 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-150.ec2.internal" Apr 19 12:09:51.907857 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.907813 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-150.ec2.internal\": node \"ip-10-0-131-150.ec2.internal\" not found" Apr 19 
12:09:51.933125 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.933083 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal"] Apr 19 12:09:51.933260 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.933187 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:51.933301 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.933280 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found" Apr 19 12:09:51.937701 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.937678 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:51.937701 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.937705 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:51.937847 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.937715 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:51.938981 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.938969 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:51.939123 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.939098 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal" Apr 19 12:09:51.939166 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.939147 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:51.940560 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.940545 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:51.940656 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.940573 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:51.940656 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.940583 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:51.940656 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.940544 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:51.940656 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.940653 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:51.940837 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.940670 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:51.941863 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.941848 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" Apr 19 12:09:51.941951 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.941876 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:51.943565 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.943548 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:51.943647 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.943574 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:51.943647 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:51.943587 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:51.967850 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.967831 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-150.ec2.internal\" not found" node="ip-10-0-131-150.ec2.internal" Apr 19 12:09:51.972130 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:51.972099 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-150.ec2.internal\" not found" node="ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.033857 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.033823 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found" Apr 19 12:09:52.034026 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.033917 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21c64da2dfd5e799e05beb88f86623ce-config\") pod 
\"kube-apiserver-proxy-ip-10-0-131-150.ec2.internal\" (UID: \"21c64da2dfd5e799e05beb88f86623ce\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.034026 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.033947 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/482a6c1a610e53baa96869a662ee4be0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal\" (UID: \"482a6c1a610e53baa96869a662ee4be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.034026 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.033966 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/482a6c1a610e53baa96869a662ee4be0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal\" (UID: \"482a6c1a610e53baa96869a662ee4be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.134447 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.134414 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found" Apr 19 12:09:52.134505 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.134462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21c64da2dfd5e799e05beb88f86623ce-config\") pod \"kube-apiserver-proxy-ip-10-0-131-150.ec2.internal\" (UID: \"21c64da2dfd5e799e05beb88f86623ce\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.134505 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.134487 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/482a6c1a610e53baa96869a662ee4be0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal\" (UID: \"482a6c1a610e53baa96869a662ee4be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.134573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.134507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/482a6c1a610e53baa96869a662ee4be0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal\" (UID: \"482a6c1a610e53baa96869a662ee4be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.134573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.134560 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/482a6c1a610e53baa96869a662ee4be0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal\" (UID: \"482a6c1a610e53baa96869a662ee4be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.134573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.134562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21c64da2dfd5e799e05beb88f86623ce-config\") pod \"kube-apiserver-proxy-ip-10-0-131-150.ec2.internal\" (UID: \"21c64da2dfd5e799e05beb88f86623ce\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal" Apr 19 12:09:52.134666 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.134570 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/482a6c1a610e53baa96869a662ee4be0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal\" (UID: \"482a6c1a610e53baa96869a662ee4be0\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal"
Apr 19 12:09:52.235142 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.235070 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found"
Apr 19 12:09:52.270304 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.270279 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal"
Apr 19 12:09:52.275066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.275047 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal"
Apr 19 12:09:52.336107 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.336077 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found"
Apr 19 12:09:52.436489 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.436457 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found"
Apr 19 12:09:52.536970 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.536866 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found"
Apr 19 12:09:52.637397 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.637357 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found"
Apr 19 12:09:52.638441 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.638424 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 19 12:09:52.638585 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.638568 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:09:52.638643 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.638610 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:09:52.722065 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.722024 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 12:04:51 +0000 UTC" deadline="2027-12-24 02:00:54.120176333 +0000 UTC"
Apr 19 12:09:52.722065 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.722067 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14725h51m1.398119944s"
Apr 19 12:09:52.731227 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.731199 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 19 12:09:52.737802 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:52.737780 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-150.ec2.internal\" not found"
Apr 19 12:09:52.740680 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.740660 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:52.741482 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:52.741450 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c64da2dfd5e799e05beb88f86623ce.slice/crio-87b5ae9efb1545add842bace88dd03976b0b40f0213898e9261fd1092ba456c8 WatchSource:0}: Error finding container 87b5ae9efb1545add842bace88dd03976b0b40f0213898e9261fd1092ba456c8: Status 404 returned error can't find the container with id 87b5ae9efb1545add842bace88dd03976b0b40f0213898e9261fd1092ba456c8
Apr 19 12:09:52.741847 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:52.741829 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482a6c1a610e53baa96869a662ee4be0.slice/crio-ad85e56f79d1ed4a1a161e34ae1ef3765b35c70c3017ac54d308189978e3c92c WatchSource:0}: Error finding container ad85e56f79d1ed4a1a161e34ae1ef3765b35c70c3017ac54d308189978e3c92c: Status 404 returned error can't find the container with id ad85e56f79d1ed4a1a161e34ae1ef3765b35c70c3017ac54d308189978e3c92c
Apr 19 12:09:52.746256 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.746237 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 12:09:52.747507 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.747481 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:09:52.766604 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.766575 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-59t77"
Apr 19 12:09:52.773621 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.773601 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-59t77"
Apr 19 12:09:52.832141 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.832059 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal"
Apr 19 12:09:52.835142 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.835078 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal" event={"ID":"21c64da2dfd5e799e05beb88f86623ce","Type":"ContainerStarted","Data":"87b5ae9efb1545add842bace88dd03976b0b40f0213898e9261fd1092ba456c8"}
Apr 19 12:09:52.835951 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.835931 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" event={"ID":"482a6c1a610e53baa96869a662ee4be0","Type":"ContainerStarted","Data":"ad85e56f79d1ed4a1a161e34ae1ef3765b35c70c3017ac54d308189978e3c92c"}
Apr 19 12:09:52.845732 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.845713 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:09:52.847791 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.847769 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal"
Apr 19 12:09:52.856316 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.856296 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:09:52.905599 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:52.905574 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:53.482334 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.482295 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:53.671523 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.671492 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:53.710245 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.710209 2567 apiserver.go:52] "Watching apiserver"
Apr 19 12:09:53.717036 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.717009 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 19 12:09:53.717471 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.717441 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2tbjd","openshift-multus/network-metrics-daemon-rxnrj","openshift-network-operator/iptables-alerter-qgqpz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q","openshift-dns/node-resolver-8gjrs","openshift-image-registry/node-ca-zfq9x","openshift-network-diagnostics/network-check-target-69kxr","openshift-ovn-kubernetes/ovnkube-node-5n96s","kube-system/konnectivity-agent-6ljqn","kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal","openshift-cluster-node-tuning-operator/tuned-txn87","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal","openshift-multus/multus-57vt2"]
Apr 19 12:09:53.720296 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.720275 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:09:53.720398 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.720358 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:09:53.721590 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.721565 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:53.721682 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.721639 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:09:53.722943 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.722920 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.724232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.724210 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.725051 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.724973 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 19 12:09:53.725201 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.725151 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:09:53.725344 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.725324 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-t5cz6\""
Apr 19 12:09:53.725344 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.725342 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 19 12:09:53.725592 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.725576 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.726351 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.726275 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 19 12:09:53.726351 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.726276 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9s8vz\""
Apr 19 12:09:53.726351 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.726331 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 19 12:09:53.726351 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.726343 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 19 12:09:53.726966 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.726935 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.727397 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.727376 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 19 12:09:53.727684 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.727667 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 19 12:09:53.727810 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.727790 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 19 12:09:53.727883 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.727854 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-59bnb\""
Apr 19 12:09:53.728412 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.728392 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.729380 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.729288 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 19 12:09:53.729380 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.729337 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 19 12:09:53.729542 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.729491 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 19 12:09:53.729542 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.729501 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 19 12:09:53.729652 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.729626 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 19 12:09:53.729961 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.729738 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6ljqn"
Apr 19 12:09:53.729961 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.729806 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dnr6p\""
Apr 19 12:09:53.731439 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.731172 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 19 12:09:53.731439 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.731181 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.731439 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.731179 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 19 12:09:53.732003 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.731977 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nd9dx\""
Apr 19 12:09:53.732072 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.732008 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 19 12:09:53.732072 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.732022 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 19 12:09:53.732868 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.732818 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 19 12:09:53.733799 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.733021 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8gjrs"
Apr 19 12:09:53.733799 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.733265 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 19 12:09:53.733799 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.733021 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-66msp\""
Apr 19 12:09:53.733989 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.733851 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 19 12:09:53.734306 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.734286 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 19 12:09:53.734533 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.734516 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:09:53.734996 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.734977 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 19 12:09:53.735571 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.735553 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qf68d\""
Apr 19 12:09:53.736676 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.736449 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 19 12:09:53.736676 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.736561 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 19 12:09:53.736676 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.736611 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cntlb\""
Apr 19 12:09:53.737174 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.737158 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.739278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.739260 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q8n5k\""
Apr 19 12:09:53.739493 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.739473 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 19 12:09:53.741257 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741229 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ed6a38c-6faa-41d7-855c-af958b4e6898-konnectivity-ca\") pod \"konnectivity-agent-6ljqn\" (UID: \"3ed6a38c-6faa-41d7-855c-af958b4e6898\") " pod="kube-system/konnectivity-agent-6ljqn"
Apr 19 12:09:53.741353 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741268 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.741353 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2527ca81-1ebd-4808-a264-00f75b2caea4-serviceca\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.741353 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741322 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwbm\" (UniqueName: \"kubernetes.io/projected/b06d6e65-9964-4785-8589-43ef464433aa-kube-api-access-7vwbm\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.741353 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741348 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-modprobe-d\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.741573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741370 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysconfig\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.741573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741384 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-device-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.741573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741428 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8hf\" (UniqueName: \"kubernetes.io/projected/2c805a23-696a-4038-acd9-e934f8c66c1d-kube-api-access-6w8hf\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.741573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-var-lib-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.741573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741494 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovnkube-config\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.741573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741533 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkx9m\" (UniqueName: \"kubernetes.io/projected/3a9d31f9-fb41-43e0-9946-0611710438a1-kube-api-access-pkx9m\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:53.741573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741556 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-etc-selinux\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741580 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2c805a23-696a-4038-acd9-e934f8c66c1d-iptables-alerter-script\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741643 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfh4z\" (UniqueName: \"kubernetes.io/projected/08fea0f9-3a6e-4ab6-b269-5668dab364ea-kube-api-access-zfh4z\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741688 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysctl-d\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741713 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-systemd\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741736 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741760 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4qh2\" (UniqueName: \"kubernetes.io/projected/2527ca81-1ebd-4808-a264-00f75b2caea4-kube-api-access-k4qh2\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741783 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovn-node-metrics-cert\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741818 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-var-lib-kubelet\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741868 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-run-netns\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741891 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-run\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-registration-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.741979 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741963 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.741997 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742038 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-sys\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742087 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742139 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-socket-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742165 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-slash\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742187 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-cni-bin\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742227 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-cni-netd\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742252 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd753b45-0daa-4581-ae69-246f33099827-tmp\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742292 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-etc-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742342 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-node-log\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742388 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-env-overrides\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742415 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-lib-modules\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742451 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-kubernetes\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742498 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-sys-fs\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742536 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-os-release\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.742638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-systemd\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742631 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ed6a38c-6faa-41d7-855c-af958b4e6898-agent-certs\") pod \"konnectivity-agent-6ljqn\" (UID: \"3ed6a38c-6faa-41d7-855c-af958b4e6898\") " pod="kube-system/konnectivity-agent-6ljqn"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742651 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c805a23-696a-4038-acd9-e934f8c66c1d-host-slash\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-systemd-units\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742702 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovnkube-script-lib\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742739 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysctl-conf\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742759 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd753b45-0daa-4581-ae69-246f33099827-etc-tuned\") pod \"tuned-txn87\"
(UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742797 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmc2b\" (UniqueName: \"kubernetes.io/projected/8cd0f946-6502-4d2a-94d4-721582219a2f-kube-api-access-jmc2b\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742847 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-cni-binary-copy\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742877 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-ovn\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742909 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94c4z\" (UniqueName: \"kubernetes.io/projected/f601721a-a6ac-4b15-8bc0-48274f620286-kube-api-access-94c4z\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742935 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-host\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742959 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlt6c\" (UniqueName: \"kubernetes.io/projected/fd753b45-0daa-4581-ae69-246f33099827-kube-api-access-qlt6c\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.742982 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8cd0f946-6502-4d2a-94d4-721582219a2f-tmp-dir\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs" Apr 19 12:09:53.743376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.743006 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:09:53.744079 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.743051 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2527ca81-1ebd-4808-a264-00f75b2caea4-host\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x" Apr 19 12:09:53.744079 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.743078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-system-cni-dir\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.744079 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.743140 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-log-socket\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.744079 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.743156 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cd0f946-6502-4d2a-94d4-721582219a2f-hosts-file\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs" Apr 19 12:09:53.744079 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.743176 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-cnibin\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.744079 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.743198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-kubelet\") pod \"ovnkube-node-5n96s\" (UID: 
\"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.774709 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.774671 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:04:52 +0000 UTC" deadline="2027-12-20 13:47:56.01091026 +0000 UTC" Apr 19 12:09:53.774709 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.774705 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14641h38m2.236208763s" Apr 19 12:09:53.832780 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.832748 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 19 12:09:53.843976 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.843935 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-cni-bin\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844152 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.843984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-cni-netd\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844152 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844014 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-cnibin\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.844152 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844039 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-daemon-config\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.844152 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-cni-bin\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844152 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd753b45-0daa-4581-ae69-246f33099827-tmp\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.844152 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844073 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-cni-netd\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844152 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-etc-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844435 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:09:53.844155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-node-log\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844161 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-etc-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844183 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-env-overrides\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844224 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-node-log\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-lib-modules\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:09:53.844261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-kubernetes\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-sys-fs\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-os-release\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-kubernetes\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " 
pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844401 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-sys-fs\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-systemd\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844404 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-lib-modules\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.844435 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844433 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ed6a38c-6faa-41d7-855c-af958b4e6898-agent-certs\") pod \"konnectivity-agent-6ljqn\" (UID: \"3ed6a38c-6faa-41d7-855c-af958b4e6898\") " pod="kube-system/konnectivity-agent-6ljqn" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844457 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-os-release\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " 
pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844450 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-systemd\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844495 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-cni-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c805a23-696a-4038-acd9-e934f8c66c1d-host-slash\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844548 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-systemd-units\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.845075 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:09:53.844573 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844591 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c805a23-696a-4038-acd9-e934f8c66c1d-host-slash\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovnkube-script-lib\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-netns\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844637 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-systemd-units\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 
12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysctl-conf\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844680 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd753b45-0daa-4581-ae69-246f33099827-etc-tuned\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmc2b\" (UniqueName: \"kubernetes.io/projected/8cd0f946-6502-4d2a-94d4-721582219a2f-kube-api-access-jmc2b\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844743 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-cni-binary-copy\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " 
pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844768 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-ovn\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:53.845075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-cni-bin\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844822 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-kubelet\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-etc-kubernetes\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8s52\" (UniqueName: \"kubernetes.io/projected/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-kube-api-access-f8s52\") pod 
\"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94c4z\" (UniqueName: \"kubernetes.io/projected/f601721a-a6ac-4b15-8bc0-48274f620286-kube-api-access-94c4z\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.844994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-os-release\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845018 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-cni-binary-copy\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-host\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845060 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlt6c\" (UniqueName: \"kubernetes.io/projected/fd753b45-0daa-4581-ae69-246f33099827-kube-api-access-qlt6c\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8cd0f946-6502-4d2a-94d4-721582219a2f-tmp-dir\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845142 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845170 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2527ca81-1ebd-4808-a264-00f75b2caea4-host\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845195 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-system-cni-dir\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845229 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-log-socket\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845246 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysctl-conf\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cd0f946-6502-4d2a-94d4-721582219a2f-hosts-file\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs"
Apr 19 12:09:53.845899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845289 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-cnibin\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845303 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8cd0f946-6502-4d2a-94d4-721582219a2f-hosts-file\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-env-overrides\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-kubelet\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-cnibin\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845418 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-host\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845510 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-system-cni-dir\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845552 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2527ca81-1ebd-4808-a264-00f75b2caea4-host\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-ovn\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8cd0f946-6502-4d2a-94d4-721582219a2f-tmp-dir\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-log-socket\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845812 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovnkube-script-lib\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845853 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ed6a38c-6faa-41d7-855c-af958b4e6898-konnectivity-ca\") pod \"konnectivity-agent-6ljqn\" (UID: \"3ed6a38c-6faa-41d7-855c-af958b4e6898\") " pod="kube-system/konnectivity-agent-6ljqn"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845881 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2527ca81-1ebd-4808-a264-00f75b2caea4-serviceca\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwbm\" (UniqueName: \"kubernetes.io/projected/b06d6e65-9964-4785-8589-43ef464433aa-kube-api-access-7vwbm\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-modprobe-d\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.845980 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysconfig\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.846494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846004 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-device-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8hf\" (UniqueName: \"kubernetes.io/projected/2c805a23-696a-4038-acd9-e934f8c66c1d-kube-api-access-6w8hf\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-var-lib-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846081 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovnkube-config\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846124 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkx9m\" (UniqueName: \"kubernetes.io/projected/3a9d31f9-fb41-43e0-9946-0611710438a1-kube-api-access-pkx9m\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-etc-selinux\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2c805a23-696a-4038-acd9-e934f8c66c1d-iptables-alerter-script\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846280 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-kubelet\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.846866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfh4z\" (UniqueName: \"kubernetes.io/projected/08fea0f9-3a6e-4ab6-b269-5668dab364ea-kube-api-access-zfh4z\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847020 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-socket-dir-parent\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847059 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-hostroot\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847085 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-conf-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysctl-d\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-systemd\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847204 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.847279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847239 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4qh2\" (UniqueName: \"kubernetes.io/projected/2527ca81-1ebd-4808-a264-00f75b2caea4-kube-api-access-k4qh2\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847759 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2527ca81-1ebd-4808-a264-00f75b2caea4-serviceca\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847845 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ed6a38c-6faa-41d7-855c-af958b4e6898-konnectivity-ca\") pod \"konnectivity-agent-6ljqn\" (UID: \"3ed6a38c-6faa-41d7-855c-af958b4e6898\") " pod="kube-system/konnectivity-agent-6ljqn"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847923 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysconfig\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847960 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-modprobe-d\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.847985 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-device-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848256 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-var-lib-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848286 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ed6a38c-6faa-41d7-855c-af958b4e6898-agent-certs\") pod \"konnectivity-agent-6ljqn\" (UID: \"3ed6a38c-6faa-41d7-855c-af958b4e6898\") " pod="kube-system/konnectivity-agent-6ljqn"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-cni-binary-copy\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848423 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-etc-selinux\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848519 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-sysctl-d\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848630 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-run-openvswitch\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848742 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-etc-systemd\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848787 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2c805a23-696a-4038-acd9-e934f8c66c1d-iptables-alerter-script\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovn-node-metrics-cert\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848940 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd753b45-0daa-4581-ae69-246f33099827-etc-tuned\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.848983 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-system-cni-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849019 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-cni-multus\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849056 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-var-lib-kubelet\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849092 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849095 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd753b45-0daa-4581-ae69-246f33099827-tmp\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-run-netns\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-run\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-var-lib-kubelet\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849197 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-registration-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849262 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849271 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-run-netns\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849293 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-k8s-cni-cncf-io\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-multus-certs\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-sys\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f601721a-a6ac-4b15-8bc0-48274f620286-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.849803 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-socket-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849443 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-slash\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849451 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849550 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08fea0f9-3a6e-4ab6-b269-5668dab364ea-host-slash\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849555 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-sys\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849650 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-socket-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849663 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovn-node-metrics-cert\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.849665 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd753b45-0daa-4581-ae69-246f33099827-run\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.849758 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:54.349733205 +0000 UTC m=+3.103925010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.849808 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b06d6e65-9964-4785-8589-43ef464433aa-registration-dir\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.850489 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.850087 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f601721a-a6ac-4b15-8bc0-48274f620286-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.852344 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.852318 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08fea0f9-3a6e-4ab6-b269-5668dab364ea-ovnkube-config\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.852724 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.852590 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:53.852724 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.852621 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:53.852724 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.852633 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ddsvq for pod openshift-network-diagnostics/network-check-target-69kxr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:53.852724 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:53.852697 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq podName:c60217db-a1f2-446f-bada-9675c7c62201 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:54.35268355 +0000 UTC m=+3.106875355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ddsvq" (UniqueName: "kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq") pod "network-check-target-69kxr" (UID: "c60217db-a1f2-446f-bada-9675c7c62201") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:53.853291 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.853214 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlt6c\" (UniqueName: \"kubernetes.io/projected/fd753b45-0daa-4581-ae69-246f33099827-kube-api-access-qlt6c\") pod \"tuned-txn87\" (UID: \"fd753b45-0daa-4581-ae69-246f33099827\") " pod="openshift-cluster-node-tuning-operator/tuned-txn87"
Apr 19 12:09:53.854427 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.854403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmc2b\" (UniqueName: \"kubernetes.io/projected/8cd0f946-6502-4d2a-94d4-721582219a2f-kube-api-access-jmc2b\") pod \"node-resolver-8gjrs\" (UID: \"8cd0f946-6502-4d2a-94d4-721582219a2f\") " pod="openshift-dns/node-resolver-8gjrs"
Apr 19 12:09:53.855253 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.854986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94c4z\" (UniqueName: \"kubernetes.io/projected/f601721a-a6ac-4b15-8bc0-48274f620286-kube-api-access-94c4z\") pod \"multus-additional-cni-plugins-2tbjd\" (UID: \"f601721a-a6ac-4b15-8bc0-48274f620286\") " pod="openshift-multus/multus-additional-cni-plugins-2tbjd"
Apr 19 12:09:53.855522 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.855496 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfh4z\" (UniqueName: \"kubernetes.io/projected/08fea0f9-3a6e-4ab6-b269-5668dab364ea-kube-api-access-zfh4z\") pod \"ovnkube-node-5n96s\" (UID: \"08fea0f9-3a6e-4ab6-b269-5668dab364ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:09:53.855747 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.855728 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwbm\" (UniqueName: \"kubernetes.io/projected/b06d6e65-9964-4785-8589-43ef464433aa-kube-api-access-7vwbm\") pod \"aws-ebs-csi-driver-node-vd75q\" (UID: \"b06d6e65-9964-4785-8589-43ef464433aa\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q"
Apr 19 12:09:53.858799 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.858780 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8hf\" (UniqueName: \"kubernetes.io/projected/2c805a23-696a-4038-acd9-e934f8c66c1d-kube-api-access-6w8hf\") pod \"iptables-alerter-qgqpz\" (UID: \"2c805a23-696a-4038-acd9-e934f8c66c1d\") " pod="openshift-network-operator/iptables-alerter-qgqpz"
Apr 19 12:09:53.859332 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.859310 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4qh2\" (UniqueName: \"kubernetes.io/projected/2527ca81-1ebd-4808-a264-00f75b2caea4-kube-api-access-k4qh2\") pod \"node-ca-zfq9x\" (UID: \"2527ca81-1ebd-4808-a264-00f75b2caea4\") " pod="openshift-image-registry/node-ca-zfq9x"
Apr 19 12:09:53.860547 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.860526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkx9m\" (UniqueName: \"kubernetes.io/projected/3a9d31f9-fb41-43e0-9946-0611710438a1-kube-api-access-pkx9m\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:53.950748 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950711 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName:
\"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-cnibin\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.950926 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-daemon-config\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.950926 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-cni-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.950926 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950860 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-cnibin\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.950926 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-netns\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950936 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-cni-bin\") pod 
\"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950943 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-cni-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950957 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-kubelet\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.950998 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-netns\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951000 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-kubelet\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-etc-kubernetes\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " 
pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-cni-bin\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8s52\" (UniqueName: \"kubernetes.io/projected/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-kube-api-access-f8s52\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951068 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-etc-kubernetes\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951082 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-os-release\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951104 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-cni-binary-copy\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:09:53.951176 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-socket-dir-parent\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-os-release\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951199 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-hostroot\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951220 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-conf-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951246 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-system-cni-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951267 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-cni-multus\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951295 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-k8s-cni-cncf-io\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-multus-certs\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951394 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-conf-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-daemon-config\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951413 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-multus-certs\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951435 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-run-k8s-cni-cncf-io\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-system-cni-dir\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-host-var-lib-cni-multus\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951499 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-hostroot\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-multus-socket-dir-parent\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.951627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.951585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-cni-binary-copy\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:53.958968 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:53.958943 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8s52\" (UniqueName: \"kubernetes.io/projected/d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216-kube-api-access-f8s52\") pod \"multus-57vt2\" (UID: \"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216\") " pod="openshift-multus/multus-57vt2" Apr 19 12:09:54.037065 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.036982 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgqpz" Apr 19 12:09:54.044993 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.044967 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" Apr 19 12:09:54.053855 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.053822 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zfq9x" Apr 19 12:09:54.059432 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.059407 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" Apr 19 12:09:54.067128 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.067095 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:09:54.074756 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.074734 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6ljqn" Apr 19 12:09:54.082332 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.082310 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-txn87" Apr 19 12:09:54.088552 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.088516 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8gjrs" Apr 19 12:09:54.092232 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.092214 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-57vt2" Apr 19 12:09:54.299299 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.299258 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf601721a_a6ac_4b15_8bc0_48274f620286.slice/crio-c9f897d1e7eb56d1301695b1f6ba65aaaac346c94e435dd2b385604cc5bbf98a WatchSource:0}: Error finding container c9f897d1e7eb56d1301695b1f6ba65aaaac346c94e435dd2b385604cc5bbf98a: Status 404 returned error can't find the container with id c9f897d1e7eb56d1301695b1f6ba65aaaac346c94e435dd2b385604cc5bbf98a Apr 19 12:09:54.312877 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.312852 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06d6e65_9964_4785_8589_43ef464433aa.slice/crio-fd7f9e8cae59def263993a91ed5d8a124b8fb2ab32cb99c0cf6af79283beb683 WatchSource:0}: Error finding container fd7f9e8cae59def263993a91ed5d8a124b8fb2ab32cb99c0cf6af79283beb683: Status 404 returned error can't find the container with id fd7f9e8cae59def263993a91ed5d8a124b8fb2ab32cb99c0cf6af79283beb683 
Apr 19 12:09:54.316245 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.316216 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08fea0f9_3a6e_4ab6_b269_5668dab364ea.slice/crio-7458f7eaf1e59d0f77ac28efd689c07e192f2f26e0aa7c338d02e5a0f0ada1f9 WatchSource:0}: Error finding container 7458f7eaf1e59d0f77ac28efd689c07e192f2f26e0aa7c338d02e5a0f0ada1f9: Status 404 returned error can't find the container with id 7458f7eaf1e59d0f77ac28efd689c07e192f2f26e0aa7c338d02e5a0f0ada1f9 Apr 19 12:09:54.316882 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.316848 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed6a38c_6faa_41d7_855c_af958b4e6898.slice/crio-1c5e9be32a550e1de94c89d4b72c0e3fca7126b7d6eeee59ec52759df17052f1 WatchSource:0}: Error finding container 1c5e9be32a550e1de94c89d4b72c0e3fca7126b7d6eeee59ec52759df17052f1: Status 404 returned error can't find the container with id 1c5e9be32a550e1de94c89d4b72c0e3fca7126b7d6eeee59ec52759df17052f1 Apr 19 12:09:54.318106 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.318075 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2527ca81_1ebd_4808_a264_00f75b2caea4.slice/crio-fcb6b1f4a780776c373678f11455e6550b7e85d226cbc55e3dcaddffc07bb304 WatchSource:0}: Error finding container fcb6b1f4a780776c373678f11455e6550b7e85d226cbc55e3dcaddffc07bb304: Status 404 returned error can't find the container with id fcb6b1f4a780776c373678f11455e6550b7e85d226cbc55e3dcaddffc07bb304 Apr 19 12:09:54.319951 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.319910 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c805a23_696a_4038_acd9_e934f8c66c1d.slice/crio-0b3222b343bf38a983e92d19e55904b6129f9167fc10a70d1e072a50825254f5 WatchSource:0}: Error finding container 0b3222b343bf38a983e92d19e55904b6129f9167fc10a70d1e072a50825254f5: Status 404 returned error can't find the container with id 0b3222b343bf38a983e92d19e55904b6129f9167fc10a70d1e072a50825254f5 Apr 19 12:09:54.322677 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.321248 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd753b45_0daa_4581_ae69_246f33099827.slice/crio-701e01bf24311c600b35b817136f1f4ea733d01c9131f353fe18a2e45499480d WatchSource:0}: Error finding container 701e01bf24311c600b35b817136f1f4ea733d01c9131f353fe18a2e45499480d: Status 404 returned error can't find the container with id 701e01bf24311c600b35b817136f1f4ea733d01c9131f353fe18a2e45499480d Apr 19 12:09:54.322953 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.322930 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd0f946_6502_4d2a_94d4_721582219a2f.slice/crio-6c2cc67b1bfdeaea555c283c20191dc91e481d4125e6bdee8aec8673e3b82f36 WatchSource:0}: Error finding container 6c2cc67b1bfdeaea555c283c20191dc91e481d4125e6bdee8aec8673e3b82f36: Status 404 returned error can't find the container with id 6c2cc67b1bfdeaea555c283c20191dc91e481d4125e6bdee8aec8673e3b82f36 Apr 19 12:09:54.325251 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:09:54.325225 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd61ab6ab_1e8a_4e45_97e4_d3bb1ed06216.slice/crio-178cb6b354f38d48fcdc135a51025edf11949089d5373fa79123366185e350b4 WatchSource:0}: Error finding container 178cb6b354f38d48fcdc135a51025edf11949089d5373fa79123366185e350b4: Status 404 returned error can't find 
the container with id 178cb6b354f38d48fcdc135a51025edf11949089d5373fa79123366185e350b4 Apr 19 12:09:54.354089 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.354063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:09:54.354182 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.354145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:09:54.354242 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:54.354226 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:09:54.354283 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:54.354256 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:09:54.354283 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:54.354232 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:54.354283 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:54.354269 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ddsvq for pod openshift-network-diagnostics/network-check-target-69kxr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:54.354423 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:54.354319 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:55.354299205 +0000 UTC m=+4.108491011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:54.354423 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:54.354338 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq podName:c60217db-a1f2-446f-bada-9675c7c62201 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:55.354328339 +0000 UTC m=+4.108520157 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ddsvq" (UniqueName: "kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq") pod "network-check-target-69kxr" (UID: "c60217db-a1f2-446f-bada-9675c7c62201") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:54.775349 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.774968 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:04:52 +0000 UTC" deadline="2028-01-21 00:31:24.043023798 +0000 UTC" Apr 19 12:09:54.775349 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.775245 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15396h21m29.267785732s" Apr 19 12:09:54.845357 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.845286 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8gjrs" event={"ID":"8cd0f946-6502-4d2a-94d4-721582219a2f","Type":"ContainerStarted","Data":"6c2cc67b1bfdeaea555c283c20191dc91e481d4125e6bdee8aec8673e3b82f36"} Apr 19 12:09:54.850285 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.850211 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-txn87" event={"ID":"fd753b45-0daa-4581-ae69-246f33099827","Type":"ContainerStarted","Data":"701e01bf24311c600b35b817136f1f4ea733d01c9131f353fe18a2e45499480d"} Apr 19 12:09:54.857584 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.857512 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" event={"ID":"b06d6e65-9964-4785-8589-43ef464433aa","Type":"ContainerStarted","Data":"fd7f9e8cae59def263993a91ed5d8a124b8fb2ab32cb99c0cf6af79283beb683"} Apr 19 12:09:54.864320 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:09:54.864271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerStarted","Data":"c9f897d1e7eb56d1301695b1f6ba65aaaac346c94e435dd2b385604cc5bbf98a"}
Apr 19 12:09:54.871593 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.871557 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6ljqn" event={"ID":"3ed6a38c-6faa-41d7-855c-af958b4e6898","Type":"ContainerStarted","Data":"1c5e9be32a550e1de94c89d4b72c0e3fca7126b7d6eeee59ec52759df17052f1"}
Apr 19 12:09:54.876937 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.876861 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgqpz" event={"ID":"2c805a23-696a-4038-acd9-e934f8c66c1d","Type":"ContainerStarted","Data":"0b3222b343bf38a983e92d19e55904b6129f9167fc10a70d1e072a50825254f5"}
Apr 19 12:09:54.889402 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.889352 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfq9x" event={"ID":"2527ca81-1ebd-4808-a264-00f75b2caea4","Type":"ContainerStarted","Data":"fcb6b1f4a780776c373678f11455e6550b7e85d226cbc55e3dcaddffc07bb304"}
Apr 19 12:09:54.896814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.894951 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"7458f7eaf1e59d0f77ac28efd689c07e192f2f26e0aa7c338d02e5a0f0ada1f9"}
Apr 19 12:09:54.897746 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.897712 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal" event={"ID":"21c64da2dfd5e799e05beb88f86623ce","Type":"ContainerStarted","Data":"44262aac35428e779eab591a8a530ae5ac44279a27ef0dc0df7bc10979de3cd2"}
Apr 19 12:09:54.906902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.906856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57vt2" event={"ID":"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216","Type":"ContainerStarted","Data":"178cb6b354f38d48fcdc135a51025edf11949089d5373fa79123366185e350b4"}
Apr 19 12:09:54.912133 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:54.912049 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-150.ec2.internal" podStartSLOduration=2.912031224 podStartE2EDuration="2.912031224s" podCreationTimestamp="2026-04-19 12:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:09:54.911596154 +0000 UTC m=+3.665787983" watchObservedRunningTime="2026-04-19 12:09:54.912031224 +0000 UTC m=+3.666223051"
Apr 19 12:09:55.364674 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:55.364630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:09:55.364898 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:55.364713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:55.364898 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.364836 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:55.364898 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.364899 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:57.364879918 +0000 UTC m=+6.119071729 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:55.365408 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.365357 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:55.365408 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.365383 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:55.365408 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.365396 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ddsvq for pod openshift-network-diagnostics/network-check-target-69kxr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:55.365633 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.365444 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq podName:c60217db-a1f2-446f-bada-9675c7c62201 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:57.365428368 +0000 UTC m=+6.119620177 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddsvq" (UniqueName: "kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq") pod "network-check-target-69kxr" (UID: "c60217db-a1f2-446f-bada-9675c7c62201") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:55.833952 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:55.833328 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:09:55.833952 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:55.833349 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:55.833952 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.833460 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:09:55.833952 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:55.833893 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:09:55.922524 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:55.921194 2567 generic.go:358] "Generic (PLEG): container finished" podID="482a6c1a610e53baa96869a662ee4be0" containerID="7e4fa8fc7c46ecf11d4ebcffa8bda9224ede67a14fd1a0fe36102e0f47dce105" exitCode=0
Apr 19 12:09:55.922524 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:55.922231 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" event={"ID":"482a6c1a610e53baa96869a662ee4be0","Type":"ContainerDied","Data":"7e4fa8fc7c46ecf11d4ebcffa8bda9224ede67a14fd1a0fe36102e0f47dce105"}
Apr 19 12:09:56.928725 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:56.928057 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" event={"ID":"482a6c1a610e53baa96869a662ee4be0","Type":"ContainerStarted","Data":"50496f6c369ca90a38567fc740070957e2760b573ebf13a20193632db9c512b1"}
Apr 19 12:09:57.382226 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:57.382133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:57.382226 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:57.382203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:09:57.382455 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.382367 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:57.382455 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.382386 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:57.382455 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.382399 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ddsvq for pod openshift-network-diagnostics/network-check-target-69kxr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:57.382455 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.382455 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq podName:c60217db-a1f2-446f-bada-9675c7c62201 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:01.382437637 +0000 UTC m=+10.136629444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddsvq" (UniqueName: "kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq") pod "network-check-target-69kxr" (UID: "c60217db-a1f2-446f-bada-9675c7c62201") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:57.382883 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.382770 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:57.382883 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.382839 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:01.382819594 +0000 UTC m=+10.137011403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:57.833843 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:57.833753 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:57.834005 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.833973 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:09:57.834068 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:57.834028 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:09:57.834150 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:57.834129 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:09:59.833847 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:59.833813 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:09:59.834327 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:09:59.833856 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:09:59.834327 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:59.833961 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:09:59.834438 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:09:59.834405 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:10:01.416668 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:01.416628 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:01.417168 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:01.416716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:01.417168 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.416813 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:10:01.417168 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.416838 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:10:01.417168 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.416847 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:01.417168 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.416852 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ddsvq for pod openshift-network-diagnostics/network-check-target-69kxr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:01.417168 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.416915 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:09.416888157 +0000 UTC m=+18.171079967 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:01.417168 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.416936 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq podName:c60217db-a1f2-446f-bada-9675c7c62201 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:09.416924982 +0000 UTC m=+18.171116793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddsvq" (UniqueName: "kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq") pod "network-check-target-69kxr" (UID: "c60217db-a1f2-446f-bada-9675c7c62201") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:01.834125 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:01.834027 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:01.834271 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.834139 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:10:01.834538 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:01.834386 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:01.834538 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:01.834499 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:10:03.833735 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:03.833653 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:03.834182 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:03.833783 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:10:03.834182 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:03.833847 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:03.834182 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:03.833975 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:10:05.832988 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:05.832954 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:05.833548 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:05.833087 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:10:05.833548 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:05.833134 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:05.833548 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:05.833247 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:10:07.833440 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:07.833397 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:07.833865 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:07.833531 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:10:07.833865 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:07.833596 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:07.833865 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:07.833713 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:10:09.472974 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:09.472935 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:09.473362 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:09.472997 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:09.473362 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.473097 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:09.473362 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.473169 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:25.473155317 +0000 UTC m=+34.227347135 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:09.473362 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.473173 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:10:09.473362 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.473200 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:10:09.473362 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.473215 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ddsvq for pod openshift-network-diagnostics/network-check-target-69kxr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:09.473362 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.473280 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq podName:c60217db-a1f2-446f-bada-9675c7c62201 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:25.473262608 +0000 UTC m=+34.227454417 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddsvq" (UniqueName: "kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq") pod "network-check-target-69kxr" (UID: "c60217db-a1f2-446f-bada-9675c7c62201") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:09.833646 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:09.833562 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:09.833646 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:09.833611 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:09.833870 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.833710 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:10:09.833870 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:09.833844 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:10:11.833888 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.833858 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:11.834778 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:11.833941 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201"
Apr 19 12:10:11.834778 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.834056 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:11.834778 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:11.834187 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1"
Apr 19 12:10:11.957172 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.957071 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6ljqn" event={"ID":"3ed6a38c-6faa-41d7-855c-af958b4e6898","Type":"ContainerStarted","Data":"9845cd4b55b838268082eba653a237856aa9c299c000f88dae97abc77da4b917"}
Apr 19 12:10:11.958982 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.958950 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfq9x" event={"ID":"2527ca81-1ebd-4808-a264-00f75b2caea4","Type":"ContainerStarted","Data":"ea3e83d04ce87f8580d1beaeaeb7a54285cd9fca2dc8d7798d1ac752020aea13"}
Apr 19 12:10:11.961809 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.961785 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log"
Apr 19 12:10:11.962197 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.962167 2567 generic.go:358] "Generic (PLEG): container finished" podID="08fea0f9-3a6e-4ab6-b269-5668dab364ea" containerID="382dfccba8fe61bc0333594657746334c759acde9e1141a8f2f337544ee8da5f" exitCode=1
Apr 19 12:10:11.962289 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.962239 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"4be99f9d967408eac0dee4812b68d5280dd5e2e2fcb5d2510b168115effdb90c"}
Apr 19 12:10:11.962289 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.962264 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"e836ea6c8deb9397b793dc732843ad79948b1b0debb7de04ecb8a563d51126ea"}
Apr 19 12:10:11.962289 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.962284 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"47a63f85aa62dab95216726a19890286b9edfecbe9b1e28b8cc2f5c8234f7775"}
Apr 19 12:10:11.962451 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.962296 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"3b47c8c938b956855bc813239cc37b2ec64a47d56efd0828f45fe0e06e34ec57"}
Apr 19 12:10:11.962451 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.962305 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerDied","Data":"382dfccba8fe61bc0333594657746334c759acde9e1141a8f2f337544ee8da5f"}
Apr 19 12:10:11.962451 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.962324 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"913a1801912d9facbf7b4f5cc462d16f5c97d983ac5bdcc50693446536c01569"}
Apr 19 12:10:11.963910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.963869 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57vt2" event={"ID":"d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216","Type":"ContainerStarted","Data":"8e066e9071e0c4c5a99b8c0b66ae9add172615aba9b2e2728d82f7100a0a3da1"}
Apr 19 12:10:11.965423 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.965398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8gjrs" event={"ID":"8cd0f946-6502-4d2a-94d4-721582219a2f","Type":"ContainerStarted","Data":"87cb8436d7ace2c26d3f66b55c6fcb50163d170150dbdccba494d940e79c916e"}
Apr 19 12:10:11.966826 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.966804 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-txn87" event={"ID":"fd753b45-0daa-4581-ae69-246f33099827","Type":"ContainerStarted","Data":"6b5d07b6177c6859801dd7f9957ae17f8247e2025a08a9ff39d2f298c8b68adf"}
Apr 19 12:10:11.968406 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.968373 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" event={"ID":"b06d6e65-9964-4785-8589-43ef464433aa","Type":"ContainerStarted","Data":"c58d79692fd1d5a4ea4d2573885377f5ddd7aed5b0954083814f53d9cc141b66"}
Apr 19 12:10:11.969910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.969867 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-150.ec2.internal" podStartSLOduration=19.969853763 podStartE2EDuration="19.969853763s" podCreationTimestamp="2026-04-19 12:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:09:56.944305868 +0000 UTC m=+5.698497687" watchObservedRunningTime="2026-04-19 12:10:11.969853763 +0000 UTC m=+20.724045591"
Apr 19 12:10:11.969910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.969895 2567 generic.go:358] "Generic (PLEG): container finished" podID="f601721a-a6ac-4b15-8bc0-48274f620286" containerID="0610252502b92b45fae6c5df7c3da10c2538e3b152aa76b0f41843fc4d65cbad" exitCode=0
Apr 19 12:10:11.970060 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.969923 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerDied","Data":"0610252502b92b45fae6c5df7c3da10c2538e3b152aa76b0f41843fc4d65cbad"}
Apr 19 12:10:11.970682 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.970642 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6ljqn" podStartSLOduration=4.325917885 podStartE2EDuration="20.970630483s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.319218633 +0000 UTC m=+3.073410452" lastFinishedPulling="2026-04-19 12:10:10.96393123 +0000 UTC m=+19.718123050" observedRunningTime="2026-04-19 12:10:11.969936575 +0000 UTC m=+20.724128415" watchObservedRunningTime="2026-04-19 12:10:11.970630483 +0000 UTC m=+20.724822312"
Apr 19 12:10:11.997395 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.997357 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-txn87" podStartSLOduration=4.43972862 podStartE2EDuration="20.997342818s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.326673805 +0000 UTC m=+3.080865613" lastFinishedPulling="2026-04-19 12:10:10.884288003 +0000 UTC m=+19.638479811" observedRunningTime="2026-04-19 12:10:11.997248287 +0000 UTC m=+20.751440114" watchObservedRunningTime="2026-04-19 12:10:11.997342818 +0000 UTC m=+20.751534649"
Apr 19 12:10:11.997757 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:11.997737 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zfq9x" podStartSLOduration=4.433535324 podStartE2EDuration="20.997731168s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.32009557 +0000 UTC m=+3.074287390" lastFinishedPulling="2026-04-19 12:10:10.884291413 +0000 UTC m=+19.638483234" observedRunningTime="2026-04-19 12:10:11.982957025 +0000 UTC m=+20.737148851" watchObservedRunningTime="2026-04-19 12:10:11.997731168 +0000 UTC m=+20.751922995"
Apr 19 12:10:12.011865 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.011831 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8gjrs" podStartSLOduration=3.417014165 podStartE2EDuration="20.011822345s" podCreationTimestamp="2026-04-19 12:09:52 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.326619022 +0000 UTC m=+3.080810842" lastFinishedPulling="2026-04-19 12:10:10.921427203 +0000 UTC m=+19.675619022" observedRunningTime="2026-04-19 12:10:12.011389864 +0000 UTC m=+20.765581701" watchObservedRunningTime="2026-04-19 12:10:12.011822345 +0000 UTC m=+20.766014172"
Apr 19 12:10:12.026693 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.026649 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-57vt2" podStartSLOduration=3.191467593 podStartE2EDuration="20.026637943s" podCreationTimestamp="2026-04-19 12:09:52 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.327841412 +0000 UTC m=+3.082033231" lastFinishedPulling="2026-04-19 12:10:11.163011761 +0000 UTC m=+19.917203581" observedRunningTime="2026-04-19 12:10:12.026512565 +0000 UTC m=+20.780704407" watchObservedRunningTime="2026-04-19 12:10:12.026637943 +0000 UTC m=+20.780829770"
Apr 19 12:10:12.410989 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.410959 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 19 12:10:12.807797 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.807571 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T12:10:12.410982403Z","UUID":"8bd6d8dc-bbf9-409c-b313-22bd80691b29","Handler":null,"Name":"","Endpoint":""}
Apr 19 12:10:12.810910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.810882 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint:
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 19 12:10:12.810910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.810917 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 19 12:10:12.977905 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.977827 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" event={"ID":"b06d6e65-9964-4785-8589-43ef464433aa","Type":"ContainerStarted","Data":"761d2d540eb503c1e08c71a03a0be05e027bbc5dbc0d51dcf306b99dd12b2a1e"} Apr 19 12:10:12.979615 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.979585 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgqpz" event={"ID":"2c805a23-696a-4038-acd9-e934f8c66c1d","Type":"ContainerStarted","Data":"22d51110da3e5610808970022850f78c7f432afabd55e0fcde88f874fd991402"} Apr 19 12:10:12.993957 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:12.993899 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qgqpz" podStartSLOduration=5.347071591 podStartE2EDuration="21.993882185s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.326690817 +0000 UTC m=+3.080882630" lastFinishedPulling="2026-04-19 12:10:10.973501416 +0000 UTC m=+19.727693224" observedRunningTime="2026-04-19 12:10:12.993231746 +0000 UTC m=+21.747423573" watchObservedRunningTime="2026-04-19 12:10:12.993882185 +0000 UTC m=+21.748074014" Apr 19 12:10:13.833931 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:13.833854 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:10:13.833931 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:13.833886 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:10:13.834251 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:13.833986 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201" Apr 19 12:10:13.834251 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:13.834139 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1" Apr 19 12:10:13.985533 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:13.985493 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:10:13.985959 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:13.985929 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"3a4b42b6fdc161abac8f6e8ba1fc66c483a731caa5e4bc9b766fa28b0d8e427d"} Apr 19 12:10:13.988276 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:13.988233 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" event={"ID":"b06d6e65-9964-4785-8589-43ef464433aa","Type":"ContainerStarted","Data":"3b6213b2097c05c00274b8c99464331140f67a8082f5e115235e5cd32edd3eb5"} Apr 19 12:10:14.003964 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:14.003912 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vd75q" podStartSLOduration=3.904885804 podStartE2EDuration="23.003894294s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.314779768 +0000 UTC m=+3.068971574" lastFinishedPulling="2026-04-19 12:10:13.413788245 +0000 UTC m=+22.167980064" observedRunningTime="2026-04-19 12:10:14.00354463 +0000 UTC m=+22.757736487" watchObservedRunningTime="2026-04-19 12:10:14.003894294 +0000 UTC m=+22.758086138" Apr 19 12:10:15.508368 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:15.508100 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6ljqn" Apr 19 12:10:15.508834 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:15.508785 2567 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6ljqn" Apr 19 12:10:15.833038 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:15.832946 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:10:15.833216 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:15.833068 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201" Apr 19 12:10:15.833216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:15.833172 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:10:15.833322 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:15.833294 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1" Apr 19 12:10:16.995508 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:16.995304 2567 generic.go:358] "Generic (PLEG): container finished" podID="f601721a-a6ac-4b15-8bc0-48274f620286" containerID="273807845af77df93a0a7abf41e7428ccaaf7b40bcbccbdfa64210ea2b442cd9" exitCode=0 Apr 19 12:10:16.996288 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:16.995390 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerDied","Data":"273807845af77df93a0a7abf41e7428ccaaf7b40bcbccbdfa64210ea2b442cd9"} Apr 19 12:10:16.998580 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:16.998456 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:10:16.998770 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:16.998741 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"a38df1fd34ca43b104dea5d77614755797781716ef55144bda848bf52b7a2f71"} Apr 19 12:10:16.999177 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:16.999158 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:10:16.999283 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:16.999184 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:10:16.999404 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:16.999389 2567 scope.go:117] "RemoveContainer" containerID="382dfccba8fe61bc0333594657746334c759acde9e1141a8f2f337544ee8da5f" Apr 19 12:10:17.014936 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:10:17.014915 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:10:17.015035 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:17.014981 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:10:17.833565 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:17.833534 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:10:17.833684 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:17.833568 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:10:17.833749 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:17.833681 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201" Apr 19 12:10:17.833828 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:17.833806 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1" Apr 19 12:10:18.002381 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.002296 2567 generic.go:358] "Generic (PLEG): container finished" podID="f601721a-a6ac-4b15-8bc0-48274f620286" containerID="52158645c9afbe88c95250a5425c1408737010f1a23f06054633a221b9af4f11" exitCode=0 Apr 19 12:10:18.002839 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.002371 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerDied","Data":"52158645c9afbe88c95250a5425c1408737010f1a23f06054633a221b9af4f11"} Apr 19 12:10:18.005631 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.005610 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:10:18.005956 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.005936 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" event={"ID":"08fea0f9-3a6e-4ab6-b269-5668dab364ea","Type":"ContainerStarted","Data":"002b29a594e7cf52bb07f63cf4928fa3cb5e3006a4ede92e72c06c37585100ee"} Apr 19 12:10:18.006181 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.006161 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 19 12:10:18.040467 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.040430 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-69kxr"] Apr 19 12:10:18.040619 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.040548 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:10:18.040661 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:18.040628 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201" Apr 19 12:10:18.043965 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.043935 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rxnrj"] Apr 19 12:10:18.044083 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.044055 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:10:18.044297 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:18.044267 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1" Apr 19 12:10:18.045002 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.044944 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" podStartSLOduration=10.331646746 podStartE2EDuration="27.044929697s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.319004319 +0000 UTC m=+3.073196130" lastFinishedPulling="2026-04-19 12:10:11.032287262 +0000 UTC m=+19.786479081" observedRunningTime="2026-04-19 12:10:18.043722745 +0000 UTC m=+26.797914573" watchObservedRunningTime="2026-04-19 12:10:18.044929697 +0000 UTC m=+26.799121521" Apr 19 12:10:18.257527 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:18.257446 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s" Apr 19 12:10:19.009781 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:19.009747 2567 generic.go:358] "Generic (PLEG): container finished" podID="f601721a-a6ac-4b15-8bc0-48274f620286" containerID="184d0c68b338bc39678e1bf4c020bea89f73bd8a255dd39a0453c0d714e1e03b" exitCode=0 Apr 19 12:10:19.010253 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:19.009834 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerDied","Data":"184d0c68b338bc39678e1bf4c020bea89f73bd8a255dd39a0453c0d714e1e03b"} Apr 19 12:10:19.832935 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:19.832900 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:10:19.833137 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:19.833065 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201" Apr 19 12:10:19.833200 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:19.833147 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:10:19.833268 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:19.833243 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1" Apr 19 12:10:21.066197 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:21.065921 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6ljqn" Apr 19 12:10:21.066661 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:21.066335 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 19 12:10:21.066810 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:21.066784 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6ljqn" Apr 19 12:10:21.834503 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:21.834466 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:10:21.834692 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:21.834517 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:10:21.834692 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:21.834582 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201" Apr 19 12:10:21.834692 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:21.834645 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1" Apr 19 12:10:23.833028 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:23.832992 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr" Apr 19 12:10:23.833700 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:23.833003 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj" Apr 19 12:10:23.833700 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:23.833152 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-69kxr" podUID="c60217db-a1f2-446f-bada-9675c7c62201" Apr 19 12:10:23.833700 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:23.833462 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rxnrj" podUID="3a9d31f9-fb41-43e0-9946-0611710438a1" Apr 19 12:10:24.048949 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.048867 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-150.ec2.internal" event="NodeReady" Apr 19 12:10:24.049094 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.049027 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 19 12:10:24.092031 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.091996 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7jlhs"] Apr 19 12:10:24.094638 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.094614 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7jlhs" Apr 19 12:10:24.095380 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.095345 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l5qnw"] Apr 19 12:10:24.096855 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.096828 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 19 12:10:24.096973 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.096827 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7hvqw\"" Apr 19 12:10:24.096973 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.096925 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 19 12:10:24.097283 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.097268 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l5qnw" Apr 19 12:10:24.099273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.099254 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 19 12:10:24.099370 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.099308 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 19 12:10:24.099434 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.099375 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 19 12:10:24.099633 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.099612 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zfmpv\"" Apr 19 12:10:24.104132 ip-10-0-131-150 kubenswrapper[2567]: 
I0419 12:10:24.104094 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7jlhs"] Apr 19 12:10:24.106838 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.106809 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l5qnw"] Apr 19 12:10:24.174121 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.174088 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb22769b-c18a-471e-9118-2fca21dc6606-config-volume\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs" Apr 19 12:10:24.174278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.174146 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw" Apr 19 12:10:24.174278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.174168 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhsl\" (UniqueName: \"kubernetes.io/projected/bb22769b-c18a-471e-9118-2fca21dc6606-kube-api-access-wnhsl\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs" Apr 19 12:10:24.174278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.174185 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjj47\" (UniqueName: \"kubernetes.io/projected/0cdd2a15-9ab0-45f9-9373-938372482e1a-kube-api-access-pjj47\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw" Apr 19 
12:10:24.174278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.174219 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb22769b-c18a-471e-9118-2fca21dc6606-tmp-dir\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.174462 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.174321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.275583 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.275546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb22769b-c18a-471e-9118-2fca21dc6606-config-volume\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.275745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.275594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:24.275745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.275622 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnhsl\" (UniqueName: \"kubernetes.io/projected/bb22769b-c18a-471e-9118-2fca21dc6606-kube-api-access-wnhsl\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.275745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.275648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjj47\" (UniqueName: \"kubernetes.io/projected/0cdd2a15-9ab0-45f9-9373-938372482e1a-kube-api-access-pjj47\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:24.275745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.275731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb22769b-c18a-471e-9118-2fca21dc6606-tmp-dir\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.276004 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.275822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.276004 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.275871 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:24.276004 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.275939 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:24.276004 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.275946 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:10:24.775913736 +0000 UTC m=+33.530105574 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:10:24.276204 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.276012 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:24.775995552 +0000 UTC m=+33.530187373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:24.276249 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.276232 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bb22769b-c18a-471e-9118-2fca21dc6606-tmp-dir\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.276285 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.276257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb22769b-c18a-471e-9118-2fca21dc6606-config-volume\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.289441 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.289408 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhsl\" (UniqueName: \"kubernetes.io/projected/bb22769b-c18a-471e-9118-2fca21dc6606-kube-api-access-wnhsl\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.289591 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.289481 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjj47\" (UniqueName: \"kubernetes.io/projected/0cdd2a15-9ab0-45f9-9373-938372482e1a-kube-api-access-pjj47\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:24.779204 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.779174 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:24.779347 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:24.779254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:24.779347 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.779310 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:24.779439 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.779350 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:24.779439 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.779387 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:10:25.779363764 +0000 UTC m=+34.533555583 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:10:24.779439 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:24.779404 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:25.779396542 +0000 UTC m=+34.533588347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:25.023517 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.023486 2567 generic.go:358] "Generic (PLEG): container finished" podID="f601721a-a6ac-4b15-8bc0-48274f620286" containerID="e68dfef9c0b9db8a9db12a0fc0cbe28314cc4953b9e4fec639e1ace0ee5db531" exitCode=0
Apr 19 12:10:25.024177 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.023554 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerDied","Data":"e68dfef9c0b9db8a9db12a0fc0cbe28314cc4953b9e4fec639e1ace0ee5db531"}
Apr 19 12:10:25.484769 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.484667 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:25.484769 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.484749 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:25.485326 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.484844 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:10:25.485326 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.484867 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:10:25.485326 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.484887 2567 projected.go:194] Error preparing data for projected volume kube-api-access-ddsvq for pod openshift-network-diagnostics/network-check-target-69kxr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:25.485326 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.484854 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:25.485326 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.484947 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq podName:c60217db-a1f2-446f-bada-9675c7c62201 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:57.484927856 +0000 UTC m=+66.239119677 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ddsvq" (UniqueName: "kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq") pod "network-check-target-69kxr" (UID: "c60217db-a1f2-446f-bada-9675c7c62201") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:25.485326 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.484968 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:57.484954259 +0000 UTC m=+66.239146063 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:25.786394 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.786297 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:25.786394 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.786370 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:25.786617 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.786446 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:25.786617 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.786483 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:25.786617 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.786512 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:27.786496561 +0000 UTC m=+36.540688365 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:25.786617 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:25.786533 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:10:27.786518414 +0000 UTC m=+36.540710228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:10:25.833284 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.833243 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:25.833453 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.833258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:25.835515 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.835489 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 19 12:10:25.836441 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.836409 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lsl4x\""
Apr 19 12:10:25.836441 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.836413 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-d9pnm\""
Apr 19 12:10:25.836614 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.836414 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 19 12:10:25.836614 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:25.836414 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 19 12:10:26.028212 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:26.028181 2567 generic.go:358] "Generic (PLEG): container finished" podID="f601721a-a6ac-4b15-8bc0-48274f620286" containerID="fad1cdf436b2a558a8ef94d26c542bf7c9aa7ba9397ff534ee28f4b06f43d23b" exitCode=0
Apr 19 12:10:26.028579 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:26.028230 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerDied","Data":"fad1cdf436b2a558a8ef94d26c542bf7c9aa7ba9397ff534ee28f4b06f43d23b"}
Apr 19 12:10:27.033267 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:27.033079 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" event={"ID":"f601721a-a6ac-4b15-8bc0-48274f620286","Type":"ContainerStarted","Data":"e67f1686fd5014e489d4bce4b0ab42b8d1fbcfdaaf70915f94d40101781c212a"}
Apr 19 12:10:27.053864 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:27.053816 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2tbjd" podStartSLOduration=5.672712401 podStartE2EDuration="36.053802507s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:09:54.308916016 +0000 UTC m=+3.063107822" lastFinishedPulling="2026-04-19 12:10:24.690006123 +0000 UTC m=+33.444197928" observedRunningTime="2026-04-19 12:10:27.053216186 +0000 UTC m=+35.807408013" watchObservedRunningTime="2026-04-19 12:10:27.053802507 +0000 UTC m=+35.807994333"
Apr 19 12:10:27.802295 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:27.802205 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:27.802295 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:27.802284 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:27.802512 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:27.802350 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:27.802512 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:27.802410 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:27.802512 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:27.802415 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:10:31.802399393 +0000 UTC m=+40.556591202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:10:27.802512 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:27.802470 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:31.802454847 +0000 UTC m=+40.556646652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:31.829801 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:31.829766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:31.830238 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:31.829825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:31.830238 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:31.829907 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:31.830238 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:31.829960 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:10:39.829946083 +0000 UTC m=+48.584137888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:10:31.830238 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:31.829914 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:31.830238 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:31.830023 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:39.830012008 +0000 UTC m=+48.584203812 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:39.887574 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:39.887532 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:39.888023 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:39.887621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:39.888023 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:39.887679 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:39.888023 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:39.887730 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:39.888023 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:39.887757 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:55.88774002 +0000 UTC m=+64.641931825 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:39.888023 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:39.887780 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:10:55.887769144 +0000 UTC m=+64.641960948 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:10:50.022636 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:50.022607 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5n96s"
Apr 19 12:10:55.899676 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:55.899637 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:10:55.900158 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:55.899716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:10:55.900158 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:55.899788 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:55.900158 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:55.899802 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:55.900158 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:55.899850 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:11:27.899836526 +0000 UTC m=+96.654028331 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:10:55.900158 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:55.899862 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:27.899856506 +0000 UTC m=+96.654048310 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:57.511125 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.511057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:10:57.511523 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.511160 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:57.514068 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.514044 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 19 12:10:57.514186 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.514124 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 19 12:10:57.522052 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:57.522034 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 19 12:10:57.522145 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:10:57.522096 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs podName:3a9d31f9-fb41-43e0-9946-0611710438a1 nodeName:}" failed. No retries permitted until 2026-04-19 12:12:01.522079027 +0000 UTC m=+130.276270831 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs") pod "network-metrics-daemon-rxnrj" (UID: "3a9d31f9-fb41-43e0-9946-0611710438a1") : secret "metrics-daemon-secret" not found
Apr 19 12:10:57.523875 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.523860 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 19 12:10:57.535675 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.535644 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddsvq\" (UniqueName: \"kubernetes.io/projected/c60217db-a1f2-446f-bada-9675c7c62201-kube-api-access-ddsvq\") pod \"network-check-target-69kxr\" (UID: \"c60217db-a1f2-446f-bada-9675c7c62201\") " pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:57.650378 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.650347 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-d9pnm\""
Apr 19 12:10:57.658099 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.658074 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:10:57.780669 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:57.780600 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-69kxr"]
Apr 19 12:10:57.783574 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:10:57.783546 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc60217db_a1f2_446f_bada_9675c7c62201.slice/crio-5710947f05987624656c8f62bc9fcd9365f83782339d82727018d4ef6931e272 WatchSource:0}: Error finding container 5710947f05987624656c8f62bc9fcd9365f83782339d82727018d4ef6931e272: Status 404 returned error can't find the container with id 5710947f05987624656c8f62bc9fcd9365f83782339d82727018d4ef6931e272
Apr 19 12:10:58.090094 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:10:58.090005 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-69kxr" event={"ID":"c60217db-a1f2-446f-bada-9675c7c62201","Type":"ContainerStarted","Data":"5710947f05987624656c8f62bc9fcd9365f83782339d82727018d4ef6931e272"}
Apr 19 12:11:01.096454 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.096356 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-69kxr" event={"ID":"c60217db-a1f2-446f-bada-9675c7c62201","Type":"ContainerStarted","Data":"78c9149d2555f8c7328da6f994112ad41d264f6d63a1db8148c8d833025c6a3a"}
Apr 19 12:11:01.096838 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.096509 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:11:01.114625 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.114577 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-69kxr" podStartSLOduration=67.099955471 podStartE2EDuration="1m10.114562677s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:10:57.785247868 +0000 UTC m=+66.539439677" lastFinishedPulling="2026-04-19 12:11:00.79985506 +0000 UTC m=+69.554046883" observedRunningTime="2026-04-19 12:11:01.108857508 +0000 UTC m=+69.863049334" watchObservedRunningTime="2026-04-19 12:11:01.114562677 +0000 UTC m=+69.868754523"
Apr 19 12:11:01.517511 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.517473 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"]
Apr 19 12:11:01.520277 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.520260 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.523370 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.523345 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 19 12:11:01.523370 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.523345 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-c9fds\""
Apr 19 12:11:01.523761 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.523385 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 19 12:11:01.523761 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.523427 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 19 12:11:01.523761 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.523345 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 19 12:11:01.528830 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.528808 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"]
Apr 19 12:11:01.536253 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.536228 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48c58967-efc0-4973-a843-94a140c8f876-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-767986f5c5-csjhw\" (UID: \"48c58967-efc0-4973-a843-94a140c8f876\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.536360 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.536283 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzx2\" (UniqueName: \"kubernetes.io/projected/48c58967-efc0-4973-a843-94a140c8f876-kube-api-access-tpzx2\") pod \"managed-serviceaccount-addon-agent-767986f5c5-csjhw\" (UID: \"48c58967-efc0-4973-a843-94a140c8f876\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.637276 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.637230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48c58967-efc0-4973-a843-94a140c8f876-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-767986f5c5-csjhw\" (UID: \"48c58967-efc0-4973-a843-94a140c8f876\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.637276 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.637283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzx2\" (UniqueName: \"kubernetes.io/projected/48c58967-efc0-4973-a843-94a140c8f876-kube-api-access-tpzx2\") pod \"managed-serviceaccount-addon-agent-767986f5c5-csjhw\" (UID: \"48c58967-efc0-4973-a843-94a140c8f876\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.639757 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.639733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48c58967-efc0-4973-a843-94a140c8f876-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-767986f5c5-csjhw\" (UID: \"48c58967-efc0-4973-a843-94a140c8f876\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.644493 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.644464 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzx2\" (UniqueName: \"kubernetes.io/projected/48c58967-efc0-4973-a843-94a140c8f876-kube-api-access-tpzx2\") pod \"managed-serviceaccount-addon-agent-767986f5c5-csjhw\" (UID: \"48c58967-efc0-4973-a843-94a140c8f876\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.837316 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.837231 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
Apr 19 12:11:01.946306 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:01.946221 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"]
Apr 19 12:11:01.949521 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:01.949484 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c58967_efc0_4973_a843_94a140c8f876.slice/crio-f767f286ef2432e3a0016521961bf4d80bb83f2afc3eeb439cfd1491004165a7 WatchSource:0}: Error finding container f767f286ef2432e3a0016521961bf4d80bb83f2afc3eeb439cfd1491004165a7: Status 404 returned error can't find the container with id f767f286ef2432e3a0016521961bf4d80bb83f2afc3eeb439cfd1491004165a7
Apr 19 12:11:02.099321 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:02.099222 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw"
event={"ID":"48c58967-efc0-4973-a843-94a140c8f876","Type":"ContainerStarted","Data":"f767f286ef2432e3a0016521961bf4d80bb83f2afc3eeb439cfd1491004165a7"} Apr 19 12:11:06.108706 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:06.108667 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw" event={"ID":"48c58967-efc0-4973-a843-94a140c8f876","Type":"ContainerStarted","Data":"21d672f46c5406b1896040562bd43911a7a3961acc508bc0a0b63f40600ea4f1"} Apr 19 12:11:06.122546 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:06.122501 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-767986f5c5-csjhw" podStartSLOduration=1.448569061 podStartE2EDuration="5.122488175s" podCreationTimestamp="2026-04-19 12:11:01 +0000 UTC" firstStartedPulling="2026-04-19 12:11:01.951318349 +0000 UTC m=+70.705510155" lastFinishedPulling="2026-04-19 12:11:05.625237459 +0000 UTC m=+74.379429269" observedRunningTime="2026-04-19 12:11:06.121800821 +0000 UTC m=+74.875992648" watchObservedRunningTime="2026-04-19 12:11:06.122488175 +0000 UTC m=+74.876680002" Apr 19 12:11:20.485557 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.485520 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b5csj"] Apr 19 12:11:20.489751 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.489730 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.492418 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.492387 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 19 12:11:20.492556 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.492484 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 19 12:11:20.493494 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.493479 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 19 12:11:20.493576 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.493486 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-jngvd\"" Apr 19 12:11:20.493576 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.493517 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 19 12:11:20.498048 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.498020 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 19 12:11:20.498480 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.498453 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b5csj"] Apr 19 12:11:20.561614 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.561574 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-snapshots\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 
19 12:11:20.561784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.561624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-tmp\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.561784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.561656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.561784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.561691 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-service-ca-bundle\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.561784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.561752 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-serving-cert\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.561908 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.561789 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxdcj\" (UniqueName: 
\"kubernetes.io/projected/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-kube-api-access-kxdcj\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.574407 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.574376 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw"] Apr 19 12:11:20.577100 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.577085 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.579475 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.579448 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 19 12:11:20.579475 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.579452 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 19 12:11:20.579657 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.579480 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-25dmw\"" Apr 19 12:11:20.579657 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.579553 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 19 12:11:20.580701 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.580683 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 19 12:11:20.584141 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.584107 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw"] Apr 19 
12:11:20.662316 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f239109c-ba15-4cff-b322-22aa684c3f58-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.662316 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662317 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.662540 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662370 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-service-ca-bundle\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.662540 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-serving-cert\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.662540 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kxdcj\" (UniqueName: \"kubernetes.io/projected/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-kube-api-access-kxdcj\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.662540 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6z5c\" (UniqueName: \"kubernetes.io/projected/f239109c-ba15-4cff-b322-22aa684c3f58-kube-api-access-f6z5c\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.662540 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662498 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-snapshots\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.662540 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662527 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-tmp\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.662817 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.662560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " 
pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.663065 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.663041 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-service-ca-bundle\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.663170 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.663152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-tmp\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.663266 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.663250 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-snapshots\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.663496 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.663478 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.664715 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.664696 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-serving-cert\") pod 
\"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.675000 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.674977 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdcj\" (UniqueName: \"kubernetes.io/projected/a80fa3ab-60ec-4151-bfb0-4bc0ed470f79-kube-api-access-kxdcj\") pod \"insights-operator-585dfdc468-b5csj\" (UID: \"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79\") " pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.763051 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.762965 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6z5c\" (UniqueName: \"kubernetes.io/projected/f239109c-ba15-4cff-b322-22aa684c3f58-kube-api-access-f6z5c\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.763051 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.763023 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f239109c-ba15-4cff-b322-22aa684c3f58-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.763051 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.763047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 
12:11:20.763323 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:20.763163 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:11:20.763323 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:20.763222 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls podName:f239109c-ba15-4cff-b322-22aa684c3f58 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:21.263208524 +0000 UTC m=+90.017400334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nzrxw" (UID: "f239109c-ba15-4cff-b322-22aa684c3f58") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:11:20.763734 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.763715 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f239109c-ba15-4cff-b322-22aa684c3f58-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.770725 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.770697 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6z5c\" (UniqueName: \"kubernetes.io/projected/f239109c-ba15-4cff-b322-22aa684c3f58-kube-api-access-f6z5c\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:20.799572 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.799545 2567 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b5csj" Apr 19 12:11:20.911292 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:20.911260 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b5csj"] Apr 19 12:11:20.914207 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:20.914182 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda80fa3ab_60ec_4151_bfb0_4bc0ed470f79.slice/crio-f09e517013f5d77ca8a60c9044baeeca973c369ce18ca99c59521bddbde64e32 WatchSource:0}: Error finding container f09e517013f5d77ca8a60c9044baeeca973c369ce18ca99c59521bddbde64e32: Status 404 returned error can't find the container with id f09e517013f5d77ca8a60c9044baeeca973c369ce18ca99c59521bddbde64e32 Apr 19 12:11:21.136961 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.136867 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b5csj" event={"ID":"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79","Type":"ContainerStarted","Data":"f09e517013f5d77ca8a60c9044baeeca973c369ce18ca99c59521bddbde64e32"} Apr 19 12:11:21.267485 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.267450 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:21.267655 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:21.267605 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:11:21.267697 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:21.267675 2567 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls podName:f239109c-ba15-4cff-b322-22aa684c3f58 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:22.267657377 +0000 UTC m=+91.021849206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nzrxw" (UID: "f239109c-ba15-4cff-b322-22aa684c3f58") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:11:21.395508 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.395431 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-c8kc9"] Apr 19 12:11:21.399596 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.399578 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.402063 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.402036 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 19 12:11:21.402063 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.402052 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 19 12:11:21.402277 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.402077 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-s7g9c\"" Apr 19 12:11:21.402277 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.402080 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:11:21.402277 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:11:21.402077 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 19 12:11:21.407814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.407785 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-c8kc9"] Apr 19 12:11:21.408101 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.408076 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 19 12:11:21.469421 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.469385 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88822cf0-d332-4d8f-ab99-d2460f2ad404-serving-cert\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.469593 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.469435 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88822cf0-d332-4d8f-ab99-d2460f2ad404-trusted-ca\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.469593 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.469565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsf22\" (UniqueName: \"kubernetes.io/projected/88822cf0-d332-4d8f-ab99-d2460f2ad404-kube-api-access-qsf22\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.469705 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:11:21.469625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88822cf0-d332-4d8f-ab99-d2460f2ad404-config\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.570960 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.570926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88822cf0-d332-4d8f-ab99-d2460f2ad404-trusted-ca\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.571388 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.571019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsf22\" (UniqueName: \"kubernetes.io/projected/88822cf0-d332-4d8f-ab99-d2460f2ad404-kube-api-access-qsf22\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.571388 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.571080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88822cf0-d332-4d8f-ab99-d2460f2ad404-config\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.571388 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.571153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88822cf0-d332-4d8f-ab99-d2460f2ad404-serving-cert\") pod \"console-operator-9d4b6777b-c8kc9\" 
(UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.571981 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.571952 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88822cf0-d332-4d8f-ab99-d2460f2ad404-trusted-ca\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.572080 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.571978 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88822cf0-d332-4d8f-ab99-d2460f2ad404-config\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.573746 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.573722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88822cf0-d332-4d8f-ab99-d2460f2ad404-serving-cert\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.579002 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.578966 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsf22\" (UniqueName: \"kubernetes.io/projected/88822cf0-d332-4d8f-ab99-d2460f2ad404-kube-api-access-qsf22\") pod \"console-operator-9d4b6777b-c8kc9\" (UID: \"88822cf0-d332-4d8f-ab99-d2460f2ad404\") " pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:21.712846 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.712812 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9"
Apr 19 12:11:21.831415 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:21.831377 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-c8kc9"]
Apr 19 12:11:21.836224 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:21.836186 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88822cf0_d332_4d8f_ab99_d2460f2ad404.slice/crio-bdc70c9c13c37b6b2f37ae0db269f11836f26698a4ff9a05f7dba7dff683faac WatchSource:0}: Error finding container bdc70c9c13c37b6b2f37ae0db269f11836f26698a4ff9a05f7dba7dff683faac: Status 404 returned error can't find the container with id bdc70c9c13c37b6b2f37ae0db269f11836f26698a4ff9a05f7dba7dff683faac
Apr 19 12:11:22.139970 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:22.139889 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" event={"ID":"88822cf0-d332-4d8f-ab99-d2460f2ad404","Type":"ContainerStarted","Data":"bdc70c9c13c37b6b2f37ae0db269f11836f26698a4ff9a05f7dba7dff683faac"}
Apr 19 12:11:22.275762 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:22.275717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw"
Apr 19 12:11:22.275950 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:22.275900 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 19 12:11:22.276012 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:22.275974 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls podName:f239109c-ba15-4cff-b322-22aa684c3f58 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:24.275953232 +0000 UTC m=+93.030145040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nzrxw" (UID: "f239109c-ba15-4cff-b322-22aa684c3f58") : secret "cluster-monitoring-operator-tls" not found
Apr 19 12:11:24.146438 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:24.146398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b5csj" event={"ID":"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79","Type":"ContainerStarted","Data":"69ff75f7c77e6da02208b8e9d3d69820ade4b764d1a5b12002983165f475373c"}
Apr 19 12:11:24.149017 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:24.148989 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/0.log"
Apr 19 12:11:24.149205 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:24.149041 2567 generic.go:358] "Generic (PLEG): container finished" podID="88822cf0-d332-4d8f-ab99-d2460f2ad404" containerID="35d09f239c1a85a595122a5e8c4a3afbde6939dabbb1bbc809a641ea0f851494" exitCode=255
Apr 19 12:11:24.149302 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:24.149203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" event={"ID":"88822cf0-d332-4d8f-ab99-d2460f2ad404","Type":"ContainerDied","Data":"35d09f239c1a85a595122a5e8c4a3afbde6939dabbb1bbc809a641ea0f851494"}
Apr 19 12:11:24.149821 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:24.149524 2567 scope.go:117] "RemoveContainer" containerID="35d09f239c1a85a595122a5e8c4a3afbde6939dabbb1bbc809a641ea0f851494"
Apr 19 12:11:24.167171 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:24.167123 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-b5csj" podStartSLOduration=1.854684245 podStartE2EDuration="4.167091382s" podCreationTimestamp="2026-04-19 12:11:20 +0000 UTC" firstStartedPulling="2026-04-19 12:11:20.91600679 +0000 UTC m=+89.670198594" lastFinishedPulling="2026-04-19 12:11:23.228413926 +0000 UTC m=+91.982605731" observedRunningTime="2026-04-19 12:11:24.16621166 +0000 UTC m=+92.920403486" watchObservedRunningTime="2026-04-19 12:11:24.167091382 +0000 UTC m=+92.921283209"
Apr 19 12:11:24.292034 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:24.292003 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw"
Apr 19 12:11:24.292171 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:24.292131 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 19 12:11:24.292216 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:24.292198 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls podName:f239109c-ba15-4cff-b322-22aa684c3f58 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:28.292182522 +0000 UTC m=+97.046374327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nzrxw" (UID: "f239109c-ba15-4cff-b322-22aa684c3f58") : secret "cluster-monitoring-operator-tls" not found
Apr 19 12:11:25.153682 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.153654 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/1.log"
Apr 19 12:11:25.154079 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.154044 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/0.log"
Apr 19 12:11:25.154147 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.154075 2567 generic.go:358] "Generic (PLEG): container finished" podID="88822cf0-d332-4d8f-ab99-d2460f2ad404" containerID="5255eaf59e0e909f317c3b7b514a0717b54d58243ec36a710855c93838c4e1a6" exitCode=255
Apr 19 12:11:25.154187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.154172 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" event={"ID":"88822cf0-d332-4d8f-ab99-d2460f2ad404","Type":"ContainerDied","Data":"5255eaf59e0e909f317c3b7b514a0717b54d58243ec36a710855c93838c4e1a6"}
Apr 19 12:11:25.154220 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.154213 2567 scope.go:117] "RemoveContainer" containerID="35d09f239c1a85a595122a5e8c4a3afbde6939dabbb1bbc809a641ea0f851494"
Apr 19 12:11:25.154448 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.154424 2567 scope.go:117] "RemoveContainer" containerID="5255eaf59e0e909f317c3b7b514a0717b54d58243ec36a710855c93838c4e1a6"
Apr 19 12:11:25.154667 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:25.154642 2567 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-c8kc9_openshift-console-operator(88822cf0-d332-4d8f-ab99-d2460f2ad404)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" podUID="88822cf0-d332-4d8f-ab99-d2460f2ad404"
Apr 19 12:11:25.394969 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.394935 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"]
Apr 19 12:11:25.398897 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.398879 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.401005 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.400976 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 19 12:11:25.401154 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.401048 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 19 12:11:25.401154 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.401098 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-z7b9x\""
Apr 19 12:11:25.401154 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.401098 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:11:25.401851 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.401836 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 19 12:11:25.403519 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.403494 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"]
Apr 19 12:11:25.500361 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.500319 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff01778-4b50-4a8a-ab4e-abf54e99b970-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.500361 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.500362 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ldhv\" (UniqueName: \"kubernetes.io/projected/0ff01778-4b50-4a8a-ab4e-abf54e99b970-kube-api-access-7ldhv\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.500614 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.500400 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff01778-4b50-4a8a-ab4e-abf54e99b970-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.601508 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.601465 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff01778-4b50-4a8a-ab4e-abf54e99b970-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.601508 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.601505 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ldhv\" (UniqueName: \"kubernetes.io/projected/0ff01778-4b50-4a8a-ab4e-abf54e99b970-kube-api-access-7ldhv\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.601713 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.601543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff01778-4b50-4a8a-ab4e-abf54e99b970-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.602129 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.602089 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ff01778-4b50-4a8a-ab4e-abf54e99b970-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.603754 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.603731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ff01778-4b50-4a8a-ab4e-abf54e99b970-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.608899 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.608867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ldhv\" (UniqueName: \"kubernetes.io/projected/0ff01778-4b50-4a8a-ab4e-abf54e99b970-kube-api-access-7ldhv\") pod \"kube-storage-version-migrator-operator-6769c5d45-xhcck\" (UID: \"0ff01778-4b50-4a8a-ab4e-abf54e99b970\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.711473 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.711423 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"
Apr 19 12:11:25.820780 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.820747 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck"]
Apr 19 12:11:25.823495 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:25.823468 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ff01778_4b50_4a8a_ab4e_abf54e99b970.slice/crio-a8e5adbc89767a2bbdf0d8e8871ddc36dd4abd2b0fb151d06294e7d41e40beff WatchSource:0}: Error finding container a8e5adbc89767a2bbdf0d8e8871ddc36dd4abd2b0fb151d06294e7d41e40beff: Status 404 returned error can't find the container with id a8e5adbc89767a2bbdf0d8e8871ddc36dd4abd2b0fb151d06294e7d41e40beff
Apr 19 12:11:25.958944 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:25.958918 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8gjrs_8cd0f946-6502-4d2a-94d4-721582219a2f/dns-node-resolver/0.log"
Apr 19 12:11:26.157250 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:26.157163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck" event={"ID":"0ff01778-4b50-4a8a-ab4e-abf54e99b970","Type":"ContainerStarted","Data":"a8e5adbc89767a2bbdf0d8e8871ddc36dd4abd2b0fb151d06294e7d41e40beff"}
Apr 19 12:11:26.158372 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:26.158354 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/1.log"
Apr 19 12:11:26.158693 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:26.158678 2567 scope.go:117] "RemoveContainer" containerID="5255eaf59e0e909f317c3b7b514a0717b54d58243ec36a710855c93838c4e1a6"
Apr 19 12:11:26.158861 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:26.158845 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-c8kc9_openshift-console-operator(88822cf0-d332-4d8f-ab99-d2460f2ad404)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" podUID="88822cf0-d332-4d8f-ab99-d2460f2ad404"
Apr 19 12:11:27.159387 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:27.159355 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zfq9x_2527ca81-1ebd-4808-a264-00f75b2caea4/node-ca/0.log"
Apr 19 12:11:27.919854 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:27.919737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:11:27.919854 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:27.919816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:11:27.920080 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:27.919909 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:11:27.920080 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:27.919912 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:11:27.920080 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:27.919971 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert podName:0cdd2a15-9ab0-45f9-9373-938372482e1a nodeName:}" failed. No retries permitted until 2026-04-19 12:12:31.919953851 +0000 UTC m=+160.674145664 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert") pod "ingress-canary-l5qnw" (UID: "0cdd2a15-9ab0-45f9-9373-938372482e1a") : secret "canary-serving-cert" not found
Apr 19 12:11:27.920080 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:27.919987 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls podName:bb22769b-c18a-471e-9118-2fca21dc6606 nodeName:}" failed. No retries permitted until 2026-04-19 12:12:31.919980569 +0000 UTC m=+160.674172377 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls") pod "dns-default-7jlhs" (UID: "bb22769b-c18a-471e-9118-2fca21dc6606") : secret "dns-default-metrics-tls" not found
Apr 19 12:11:28.323467 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:28.323429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw"
Apr 19 12:11:28.323919 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:28.323563 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 19 12:11:28.323919 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:28.323623 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls podName:f239109c-ba15-4cff-b322-22aa684c3f58 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:36.323607661 +0000 UTC m=+105.077799482 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nzrxw" (UID: "f239109c-ba15-4cff-b322-22aa684c3f58") : secret "cluster-monitoring-operator-tls" not found
Apr 19 12:11:29.166611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:29.166578 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck" event={"ID":"0ff01778-4b50-4a8a-ab4e-abf54e99b970","Type":"ContainerStarted","Data":"16da9d22ff6d5c44228cc72b110cc586b33dc1571562544b38630a7a4a413d5d"}
Apr 19 12:11:29.181502 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:29.181459 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck" podStartSLOduration=1.503694552 podStartE2EDuration="4.181443972s" podCreationTimestamp="2026-04-19 12:11:25 +0000 UTC" firstStartedPulling="2026-04-19 12:11:25.825302723 +0000 UTC m=+94.579494529" lastFinishedPulling="2026-04-19 12:11:28.503052139 +0000 UTC m=+97.257243949" observedRunningTime="2026-04-19 12:11:29.181064344 +0000 UTC m=+97.935256168" watchObservedRunningTime="2026-04-19 12:11:29.181443972 +0000 UTC m=+97.935635799"
Apr 19 12:11:30.755520 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.755488 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"]
Apr 19 12:11:30.758579 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.758557 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"
Apr 19 12:11:30.761096 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.761075 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 19 12:11:30.761781 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.761763 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 19 12:11:30.761872 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.761772 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-ql5mw\""
Apr 19 12:11:30.764308 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.764289 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"]
Apr 19 12:11:30.845042 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.845003 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpdqc\" (UniqueName: \"kubernetes.io/projected/d63f033d-38ec-4dd5-be2a-1c76af7cde7d-kube-api-access-gpdqc\") pod \"migrator-74bb7799d9-pllfr\" (UID: \"d63f033d-38ec-4dd5-be2a-1c76af7cde7d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"
Apr 19 12:11:30.945591 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.945554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpdqc\" (UniqueName: \"kubernetes.io/projected/d63f033d-38ec-4dd5-be2a-1c76af7cde7d-kube-api-access-gpdqc\") pod \"migrator-74bb7799d9-pllfr\" (UID: \"d63f033d-38ec-4dd5-be2a-1c76af7cde7d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"
Apr 19 12:11:30.952594 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:30.952563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpdqc\" (UniqueName: \"kubernetes.io/projected/d63f033d-38ec-4dd5-be2a-1c76af7cde7d-kube-api-access-gpdqc\") pod \"migrator-74bb7799d9-pllfr\" (UID: \"d63f033d-38ec-4dd5-be2a-1c76af7cde7d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"
Apr 19 12:11:31.067690 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:31.067586 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"
Apr 19 12:11:31.180479 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:31.180449 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr"]
Apr 19 12:11:31.183435 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:31.183404 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd63f033d_38ec_4dd5_be2a_1c76af7cde7d.slice/crio-79dd377e7b31ffec551abb2f367656aeaa09b8c61d2b62de185cd307190daece WatchSource:0}: Error finding container 79dd377e7b31ffec551abb2f367656aeaa09b8c61d2b62de185cd307190daece: Status 404 returned error can't find the container with id 79dd377e7b31ffec551abb2f367656aeaa09b8c61d2b62de185cd307190daece
Apr 19 12:11:31.713818 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:31.713784 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9"
Apr 19 12:11:31.713818 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:31.713814 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9"
Apr 19 12:11:31.714167 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:31.714154 2567 scope.go:117] "RemoveContainer" containerID="5255eaf59e0e909f317c3b7b514a0717b54d58243ec36a710855c93838c4e1a6"
Apr 19 12:11:31.714321 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:31.714304 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-c8kc9_openshift-console-operator(88822cf0-d332-4d8f-ab99-d2460f2ad404)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" podUID="88822cf0-d332-4d8f-ab99-d2460f2ad404"
Apr 19 12:11:32.102466 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:32.102388 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-69kxr"
Apr 19 12:11:32.175732 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:32.175686 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr" event={"ID":"d63f033d-38ec-4dd5-be2a-1c76af7cde7d","Type":"ContainerStarted","Data":"79dd377e7b31ffec551abb2f367656aeaa09b8c61d2b62de185cd307190daece"}
Apr 19 12:11:33.179187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.179155 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr" event={"ID":"d63f033d-38ec-4dd5-be2a-1c76af7cde7d","Type":"ContainerStarted","Data":"83da6bf1fcc5df629092debc5f8eb52a77ad86be56286729673ead89e1c6c89f"}
Apr 19 12:11:33.179585 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.179195 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr" event={"ID":"d63f033d-38ec-4dd5-be2a-1c76af7cde7d","Type":"ContainerStarted","Data":"62b948fb2c24372e651058e5be38eb05b9bb5b954201eb4b64f636851aba3889"}
Apr 19 12:11:33.192895 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.192843 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-pllfr" podStartSLOduration=1.684235412 podStartE2EDuration="3.192821937s" podCreationTimestamp="2026-04-19 12:11:30 +0000 UTC" firstStartedPulling="2026-04-19 12:11:31.185611836 +0000 UTC m=+99.939803642" lastFinishedPulling="2026-04-19 12:11:32.694198348 +0000 UTC m=+101.448390167" observedRunningTime="2026-04-19 12:11:33.192067457 +0000 UTC m=+101.946259285" watchObservedRunningTime="2026-04-19 12:11:33.192821937 +0000 UTC m=+101.947013758"
Apr 19 12:11:33.612602 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.612518 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fg8rh"]
Apr 19 12:11:33.615387 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.615370 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.617579 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.617560 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 19 12:11:33.618372 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.618353 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 19 12:11:33.618487 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.618391 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 19 12:11:33.618487 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.618398 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-m6r6c\""
Apr 19 12:11:33.618487 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.618422 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 19 12:11:33.624369 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.624347 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fg8rh"]
Apr 19 12:11:33.667736 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.667703 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5sr\" (UniqueName: \"kubernetes.io/projected/f34016fd-bc2f-43ff-b35c-a0544b792d06-kube-api-access-mt5sr\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.667736 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.667751 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f34016fd-bc2f-43ff-b35c-a0544b792d06-signing-cabundle\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.667946 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.667783 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f34016fd-bc2f-43ff-b35c-a0544b792d06-signing-key\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.768496 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.768457 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5sr\" (UniqueName: \"kubernetes.io/projected/f34016fd-bc2f-43ff-b35c-a0544b792d06-kube-api-access-mt5sr\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.768496 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.768504 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f34016fd-bc2f-43ff-b35c-a0544b792d06-signing-cabundle\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.768695 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.768541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f34016fd-bc2f-43ff-b35c-a0544b792d06-signing-key\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.769224 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.769204 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f34016fd-bc2f-43ff-b35c-a0544b792d06-signing-cabundle\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.771582 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.771565 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f34016fd-bc2f-43ff-b35c-a0544b792d06-signing-key\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.775517 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.775495 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5sr\" (UniqueName: \"kubernetes.io/projected/f34016fd-bc2f-43ff-b35c-a0544b792d06-kube-api-access-mt5sr\") pod \"service-ca-865cb79987-fg8rh\" (UID: \"f34016fd-bc2f-43ff-b35c-a0544b792d06\") " pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:33.924342 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:33.924268 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fg8rh"
Apr 19 12:11:34.038666 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:34.038525 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fg8rh"]
Apr 19 12:11:34.041101 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:34.041072 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34016fd_bc2f_43ff_b35c_a0544b792d06.slice/crio-5888b57e76f3375e1020d2b821a1eb463863754bb70f02eab8b44e680ecee5fc WatchSource:0}: Error finding container 5888b57e76f3375e1020d2b821a1eb463863754bb70f02eab8b44e680ecee5fc: Status 404 returned error can't find the container with id 5888b57e76f3375e1020d2b821a1eb463863754bb70f02eab8b44e680ecee5fc
Apr 19 12:11:34.186137 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:34.186093 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fg8rh" event={"ID":"f34016fd-bc2f-43ff-b35c-a0544b792d06","Type":"ContainerStarted","Data":"5888b57e76f3375e1020d2b821a1eb463863754bb70f02eab8b44e680ecee5fc"}
Apr 19 12:11:36.192595 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:36.192557 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fg8rh" event={"ID":"f34016fd-bc2f-43ff-b35c-a0544b792d06","Type":"ContainerStarted","Data":"2d9e14e7d6c7b9283b9569e4ec775034b908381abdec0be93df769c72a28f98b"}
Apr 19 12:11:36.208068 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:36.208018 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-fg8rh" 
podStartSLOduration=1.6778325619999999 podStartE2EDuration="3.208005307s" podCreationTimestamp="2026-04-19 12:11:33 +0000 UTC" firstStartedPulling="2026-04-19 12:11:34.042895585 +0000 UTC m=+102.797087391" lastFinishedPulling="2026-04-19 12:11:35.573068328 +0000 UTC m=+104.327260136" observedRunningTime="2026-04-19 12:11:36.207253026 +0000 UTC m=+104.961444854" watchObservedRunningTime="2026-04-19 12:11:36.208005307 +0000 UTC m=+104.962197134" Apr 19 12:11:36.390332 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:36.390296 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:36.390484 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:36.390434 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 19 12:11:36.390524 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:36.390496 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls podName:f239109c-ba15-4cff-b322-22aa684c3f58 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:52.390478534 +0000 UTC m=+121.144670363 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nzrxw" (UID: "f239109c-ba15-4cff-b322-22aa684c3f58") : secret "cluster-monitoring-operator-tls" not found Apr 19 12:11:42.833862 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:42.833831 2567 scope.go:117] "RemoveContainer" containerID="5255eaf59e0e909f317c3b7b514a0717b54d58243ec36a710855c93838c4e1a6" Apr 19 12:11:43.210671 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:43.210644 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:11:43.211030 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:43.211016 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/1.log" Apr 19 12:11:43.211126 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:43.211047 2567 generic.go:358] "Generic (PLEG): container finished" podID="88822cf0-d332-4d8f-ab99-d2460f2ad404" containerID="9858adefffef3312f227b1c09cfbb90a1e5d77627f92302cc427f29f99dbcdbc" exitCode=255 Apr 19 12:11:43.211126 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:43.211074 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" event={"ID":"88822cf0-d332-4d8f-ab99-d2460f2ad404","Type":"ContainerDied","Data":"9858adefffef3312f227b1c09cfbb90a1e5d77627f92302cc427f29f99dbcdbc"} Apr 19 12:11:43.211126 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:43.211100 2567 scope.go:117] "RemoveContainer" containerID="5255eaf59e0e909f317c3b7b514a0717b54d58243ec36a710855c93838c4e1a6" Apr 19 12:11:43.211460 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:43.211443 2567 scope.go:117] 
"RemoveContainer" containerID="9858adefffef3312f227b1c09cfbb90a1e5d77627f92302cc427f29f99dbcdbc" Apr 19 12:11:43.211620 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:43.211602 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-c8kc9_openshift-console-operator(88822cf0-d332-4d8f-ab99-d2460f2ad404)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" podUID="88822cf0-d332-4d8f-ab99-d2460f2ad404" Apr 19 12:11:44.214492 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:44.214464 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:11:51.713757 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:51.713722 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:51.713757 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:51.713758 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:11:51.714253 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:51.714039 2567 scope.go:117] "RemoveContainer" containerID="9858adefffef3312f227b1c09cfbb90a1e5d77627f92302cc427f29f99dbcdbc" Apr 19 12:11:51.714253 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:51.714213 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-c8kc9_openshift-console-operator(88822cf0-d332-4d8f-ab99-d2460f2ad404)\"" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" 
podUID="88822cf0-d332-4d8f-ab99-d2460f2ad404" Apr 19 12:11:52.412066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:52.412025 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:52.414380 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:52.414350 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f239109c-ba15-4cff-b322-22aa684c3f58-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nzrxw\" (UID: \"f239109c-ba15-4cff-b322-22aa684c3f58\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:52.688654 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:52.688630 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-25dmw\"" Apr 19 12:11:52.696363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:52.696344 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" Apr 19 12:11:52.805228 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:52.805192 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw"] Apr 19 12:11:52.808314 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:52.808287 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf239109c_ba15_4cff_b322_22aa684c3f58.slice/crio-bdf2e992f81d676560c0cf28ee5065dca6d4a4b039848113e1c0cfc6c4729039 WatchSource:0}: Error finding container bdf2e992f81d676560c0cf28ee5065dca6d4a4b039848113e1c0cfc6c4729039: Status 404 returned error can't find the container with id bdf2e992f81d676560c0cf28ee5065dca6d4a4b039848113e1c0cfc6c4729039 Apr 19 12:11:53.238726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:53.238690 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" event={"ID":"f239109c-ba15-4cff-b322-22aa684c3f58","Type":"ContainerStarted","Data":"bdf2e992f81d676560c0cf28ee5065dca6d4a4b039848113e1c0cfc6c4729039"} Apr 19 12:11:54.433571 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.433542 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc"] Apr 19 12:11:54.438786 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.438764 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" Apr 19 12:11:54.440886 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.440868 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-lhvjw\"" Apr 19 12:11:54.441730 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.441710 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 19 12:11:54.441820 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.441806 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 19 12:11:54.443252 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.443220 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc"] Apr 19 12:11:54.527641 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.527613 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eaa8980b-c94a-4ef2-8be6-d92a83338f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vbdrc\" (UID: \"eaa8980b-c94a-4ef2-8be6-d92a83338f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" Apr 19 12:11:54.527786 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.527711 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eaa8980b-c94a-4ef2-8be6-d92a83338f04-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vbdrc\" (UID: \"eaa8980b-c94a-4ef2-8be6-d92a83338f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" Apr 19 12:11:54.539377 ip-10-0-131-150 kubenswrapper[2567]: 
I0419 12:11:54.539345 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-srrrj"] Apr 19 12:11:54.542614 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.542587 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-845d94bb8-vt2lj"] Apr 19 12:11:54.542745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.542715 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.545566 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.545541 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.546094 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.546076 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k9qgl\"" Apr 19 12:11:54.546227 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.546151 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 19 12:11:54.546303 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.546272 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 19 12:11:54.547745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.547720 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 19 12:11:54.547859 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.547790 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rh992\"" Apr 19 12:11:54.547859 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.547838 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 19 12:11:54.547973 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.547720 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 19 12:11:54.553203 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.553147 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 19 12:11:54.553596 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.553573 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-srrrj"] Apr 19 12:11:54.556546 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.556495 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-845d94bb8-vt2lj"] Apr 19 12:11:54.593947 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.593905 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-845d94bb8-vt2lj"] Apr 19 12:11:54.594181 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:11:54.594151 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-bkww9 registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" podUID="02bef613-dc00-4550-8a61-17496a7669c2" Apr 19 12:11:54.628758 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.628725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-bound-sa-token\") pod \"image-registry-845d94bb8-vt2lj\" (UID: 
\"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.628910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.628778 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-image-registry-private-configuration\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.628910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.628811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eaa8980b-c94a-4ef2-8be6-d92a83338f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vbdrc\" (UID: \"eaa8980b-c94a-4ef2-8be6-d92a83338f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" Apr 19 12:11:54.628910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.628845 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-registry-certificates\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.628910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.628869 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-registry-tls\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.629163 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629014 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkww9\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-kube-api-access-bkww9\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.629163 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629047 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c089de39-3da5-44fa-94ec-ad70051ece2c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.629163 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629068 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l8t6\" (UniqueName: \"kubernetes.io/projected/c089de39-3da5-44fa-94ec-ad70051ece2c-kube-api-access-9l8t6\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.629163 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629141 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c089de39-3da5-44fa-94ec-ad70051ece2c-data-volume\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.629363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629172 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c089de39-3da5-44fa-94ec-ad70051ece2c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.629363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629225 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-installation-pull-secrets\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.629363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c089de39-3da5-44fa-94ec-ad70051ece2c-crio-socket\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.629363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02bef613-dc00-4550-8a61-17496a7669c2-ca-trust-extracted\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.629363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eaa8980b-c94a-4ef2-8be6-d92a83338f04-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vbdrc\" (UID: \"eaa8980b-c94a-4ef2-8be6-d92a83338f04\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" Apr 19 12:11:54.629363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-trusted-ca\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.629947 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.629924 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eaa8980b-c94a-4ef2-8be6-d92a83338f04-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vbdrc\" (UID: \"eaa8980b-c94a-4ef2-8be6-d92a83338f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" Apr 19 12:11:54.631476 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.631458 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eaa8980b-c94a-4ef2-8be6-d92a83338f04-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vbdrc\" (UID: \"eaa8980b-c94a-4ef2-8be6-d92a83338f04\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" Apr 19 12:11:54.729810 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.729728 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c089de39-3da5-44fa-94ec-ad70051ece2c-data-volume\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.729810 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.729772 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c089de39-3da5-44fa-94ec-ad70051ece2c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.729999 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.729831 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-installation-pull-secrets\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.729999 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.729871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c089de39-3da5-44fa-94ec-ad70051ece2c-crio-socket\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj" Apr 19 12:11:54.729999 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.729902 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02bef613-dc00-4550-8a61-17496a7669c2-ca-trust-extracted\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.729999 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.729933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-trusted-ca\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 
12:11:54.729999 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.729967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-bound-sa-token\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.730190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-image-registry-private-configuration\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.730190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-registry-certificates\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.730190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730071 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-registry-tls\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj" Apr 19 12:11:54.730190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkww9\" (UniqueName: 
\"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-kube-api-access-bkww9\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.730190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c089de39-3da5-44fa-94ec-ad70051ece2c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:54.730190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730158 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c089de39-3da5-44fa-94ec-ad70051ece2c-data-volume\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:54.730190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l8t6\" (UniqueName: \"kubernetes.io/projected/c089de39-3da5-44fa-94ec-ad70051ece2c-kube-api-access-9l8t6\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:54.730767 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730358 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02bef613-dc00-4550-8a61-17496a7669c2-ca-trust-extracted\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.730767 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c089de39-3da5-44fa-94ec-ad70051ece2c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:54.730767 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730587 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c089de39-3da5-44fa-94ec-ad70051ece2c-crio-socket\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:54.730964 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.730945 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-registry-certificates\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.731279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.731234 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-trusted-ca\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.733132 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.733088 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-image-registry-private-configuration\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.733252 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.733098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c089de39-3da5-44fa-94ec-ad70051ece2c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:54.733252 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.733233 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-registry-tls\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.740918 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.740896 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-installation-pull-secrets\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.741319 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.741273 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkww9\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-kube-api-access-bkww9\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.741556 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.741533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l8t6\" (UniqueName: \"kubernetes.io/projected/c089de39-3da5-44fa-94ec-ad70051ece2c-kube-api-access-9l8t6\") pod \"insights-runtime-extractor-srrrj\" (UID: \"c089de39-3da5-44fa-94ec-ad70051ece2c\") " pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:54.741688 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.741661 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-bound-sa-token\") pod \"image-registry-845d94bb8-vt2lj\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") " pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:54.748290 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.748271 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc"
Apr 19 12:11:54.854363 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:54.854333 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-srrrj"
Apr 19 12:11:55.115862 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.115836 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc"]
Apr 19 12:11:55.116688 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:55.116653 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa8980b_c94a_4ef2_8be6_d92a83338f04.slice/crio-6fc15c8084ae5b6d1d511214ee312aabb891a602532aba34d6490a23463a174b WatchSource:0}: Error finding container 6fc15c8084ae5b6d1d511214ee312aabb891a602532aba34d6490a23463a174b: Status 404 returned error can't find the container with id 6fc15c8084ae5b6d1d511214ee312aabb891a602532aba34d6490a23463a174b
Apr 19 12:11:55.128930 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.128902 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-srrrj"]
Apr 19 12:11:55.132469 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:11:55.132445 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc089de39_3da5_44fa_94ec_ad70051ece2c.slice/crio-33e21d8130129e9dfa798c60003c803c55fb7d351d8d69a6d40baa75267df8d8 WatchSource:0}: Error finding container 33e21d8130129e9dfa798c60003c803c55fb7d351d8d69a6d40baa75267df8d8: Status 404 returned error can't find the container with id 33e21d8130129e9dfa798c60003c803c55fb7d351d8d69a6d40baa75267df8d8
Apr 19 12:11:55.244445 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.244413 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srrrj" event={"ID":"c089de39-3da5-44fa-94ec-ad70051ece2c","Type":"ContainerStarted","Data":"89fcda93d44082891474a90f4cb9d88c075b4f06480624850a4a8869eb870d7a"}
Apr 19 12:11:55.244602 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.244452 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srrrj" event={"ID":"c089de39-3da5-44fa-94ec-ad70051ece2c","Type":"ContainerStarted","Data":"33e21d8130129e9dfa798c60003c803c55fb7d351d8d69a6d40baa75267df8d8"}
Apr 19 12:11:55.245646 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.245622 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" event={"ID":"f239109c-ba15-4cff-b322-22aa684c3f58","Type":"ContainerStarted","Data":"f372c1ff060403deacaac10ce64a9a5886494e4fc02426d7312e1053d8ac8ea9"}
Apr 19 12:11:55.246636 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.246613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" event={"ID":"eaa8980b-c94a-4ef2-8be6-d92a83338f04","Type":"ContainerStarted","Data":"6fc15c8084ae5b6d1d511214ee312aabb891a602532aba34d6490a23463a174b"}
Apr 19 12:11:55.246715 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.246645 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:55.250709 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.250666 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:55.261712 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.261676 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nzrxw" podStartSLOduration=33.031657933 podStartE2EDuration="35.261666617s" podCreationTimestamp="2026-04-19 12:11:20 +0000 UTC" firstStartedPulling="2026-04-19 12:11:52.810147102 +0000 UTC m=+121.564338906" lastFinishedPulling="2026-04-19 12:11:55.04015578 +0000 UTC m=+123.794347590" observedRunningTime="2026-04-19 12:11:55.260892789 +0000 UTC m=+124.015084616" watchObservedRunningTime="2026-04-19 12:11:55.261666617 +0000 UTC m=+124.015858443"
Apr 19 12:11:55.334080 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334050 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkww9\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-kube-api-access-bkww9\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334224 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334086 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-installation-pull-secrets\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334224 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334104 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02bef613-dc00-4550-8a61-17496a7669c2-ca-trust-extracted\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334224 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334144 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-bound-sa-token\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334224 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334171 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-registry-certificates\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334432 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334213 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-image-registry-private-configuration\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334432 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334265 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-registry-tls\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334432 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334293 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-trusted-ca\") pod \"02bef613-dc00-4550-8a61-17496a7669c2\" (UID: \"02bef613-dc00-4550-8a61-17496a7669c2\") "
Apr 19 12:11:55.334432 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334397 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bef613-dc00-4550-8a61-17496a7669c2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 19 12:11:55.335317 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334614 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:11:55.335317 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334644 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02bef613-dc00-4550-8a61-17496a7669c2-ca-trust-extracted\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:55.335317 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.334934 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:11:55.336662 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.336631 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:11:55.336776 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.336705 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:11:55.336776 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.336716 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-kube-api-access-bkww9" (OuterVolumeSpecName: "kube-api-access-bkww9") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "kube-api-access-bkww9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:11:55.336776 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.336749 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:11:55.337375 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.337355 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "02bef613-dc00-4550-8a61-17496a7669c2" (UID: "02bef613-dc00-4550-8a61-17496a7669c2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:11:55.435536 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.435500 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkww9\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-kube-api-access-bkww9\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:55.435536 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.435530 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-installation-pull-secrets\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:55.435536 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.435540 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-bound-sa-token\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:55.435936 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.435549 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-registry-certificates\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:55.435936 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.435560 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02bef613-dc00-4550-8a61-17496a7669c2-image-registry-private-configuration\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:55.435936 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.435571 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02bef613-dc00-4550-8a61-17496a7669c2-registry-tls\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:55.435936 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:55.435580 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02bef613-dc00-4550-8a61-17496a7669c2-trusted-ca\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:11:56.255627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:56.255592 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" event={"ID":"eaa8980b-c94a-4ef2-8be6-d92a83338f04","Type":"ContainerStarted","Data":"cbb96c729edc67a34fa9292c285bbcb5a8a0e65a898257e08735857a44bd81e0"}
Apr 19 12:11:56.257032 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:56.257002 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srrrj" event={"ID":"c089de39-3da5-44fa-94ec-ad70051ece2c","Type":"ContainerStarted","Data":"94798e696f6108691a7a3cab899383ca0598485a636fc819c8c8908b9e7cef10"}
Apr 19 12:11:56.257169 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:56.257151 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-845d94bb8-vt2lj"
Apr 19 12:11:56.270465 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:56.270426 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vbdrc" podStartSLOduration=1.224220048 podStartE2EDuration="2.270414426s" podCreationTimestamp="2026-04-19 12:11:54 +0000 UTC" firstStartedPulling="2026-04-19 12:11:55.118707305 +0000 UTC m=+123.872899110" lastFinishedPulling="2026-04-19 12:11:56.164901678 +0000 UTC m=+124.919093488" observedRunningTime="2026-04-19 12:11:56.269513725 +0000 UTC m=+125.023705553" watchObservedRunningTime="2026-04-19 12:11:56.270414426 +0000 UTC m=+125.024606312"
Apr 19 12:11:56.299294 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:56.299260 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-845d94bb8-vt2lj"]
Apr 19 12:11:56.301969 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:56.301948 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-845d94bb8-vt2lj"]
Apr 19 12:11:57.837986 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:57.837948 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bef613-dc00-4550-8a61-17496a7669c2" path="/var/lib/kubelet/pods/02bef613-dc00-4550-8a61-17496a7669c2/volumes"
Apr 19 12:11:58.264563 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:58.264528 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-srrrj" event={"ID":"c089de39-3da5-44fa-94ec-ad70051ece2c","Type":"ContainerStarted","Data":"a082f40b29e7218743b0545e145c5f4cae5961aaa9bca2b181f7c833149cb9e1"}
Apr 19 12:11:58.289928 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:11:58.289874 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-srrrj" podStartSLOduration=1.4433137999999999 podStartE2EDuration="4.289858192s" podCreationTimestamp="2026-04-19 12:11:54 +0000 UTC" firstStartedPulling="2026-04-19 12:11:55.200543098 +0000 UTC m=+123.954734910" lastFinishedPulling="2026-04-19 12:11:58.047087483 +0000 UTC m=+126.801279302" observedRunningTime="2026-04-19 12:11:58.288251932 +0000 UTC m=+127.042443759" watchObservedRunningTime="2026-04-19 12:11:58.289858192 +0000 UTC m=+127.044050019"
Apr 19 12:12:01.586902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.586866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:12:01.589201 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.589178 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a9d31f9-fb41-43e0-9946-0611710438a1-metrics-certs\") pod \"network-metrics-daemon-rxnrj\" (UID: \"3a9d31f9-fb41-43e0-9946-0611710438a1\") " pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:12:01.844460 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.844389 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lsl4x\""
Apr 19 12:12:01.852738 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.852713 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rxnrj"
Apr 19 12:12:01.973424 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.972517 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"]
Apr 19 12:12:01.977768 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.977747 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4hzrs"]
Apr 19 12:12:01.978853 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.978033 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:01.980515 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.980286 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 19 12:12:01.980662 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.980585 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 19 12:12:01.980931 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.980780 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 19 12:12:01.981177 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.981145 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-5nv8s\""
Apr 19 12:12:01.983137 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.981635 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 19 12:12:01.983137 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.982078 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:01.984595 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.984574 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"]
Apr 19 12:12:01.984912 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.984841 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lgltw\""
Apr 19 12:12:01.985099 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.985083 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 19 12:12:01.985278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.985263 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 19 12:12:01.985508 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:01.985493 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 19 12:12:02.018214 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.018166 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rxnrj"]
Apr 19 12:12:02.090554 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.090533 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-textfile\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.090686 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.090647 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/898b1e53-98cc-4608-bcf9-727c3e285ef4-metrics-client-ca\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.090746 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.090721 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-root\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.090801 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.090758 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.090868 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.090830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-wtmp\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.090929 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.090890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.090985 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.090960 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/20d0fb14-1bf7-42b5-bee5-1769c34fea15-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.091037 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091011 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clc2q\" (UniqueName: \"kubernetes.io/projected/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-api-access-clc2q\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.091090 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20d0fb14-1bf7-42b5-bee5-1769c34fea15-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.091164 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091107 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.091226 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091169 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-sys\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.091226 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091216 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-tls\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.091337 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091246 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf88r\" (UniqueName: \"kubernetes.io/projected/898b1e53-98cc-4608-bcf9-727c3e285ef4-kube-api-access-xf88r\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.091337 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.091440 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.091357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-sys\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192535 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-tls\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192567 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf88r\" (UniqueName: \"kubernetes.io/projected/898b1e53-98cc-4608-bcf9-727c3e285ef4-kube-api-access-xf88r\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-sys\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192709 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-textfile\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192764 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/898b1e53-98cc-4608-bcf9-727c3e285ef4-metrics-client-ca\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-root\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192815 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192857 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-wtmp\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192919 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/20d0fb14-1bf7-42b5-bee5-1769c34fea15-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"
Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clc2q\"
(UniqueName: \"kubernetes.io/projected/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-api-access-clc2q\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.192986 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20d0fb14-1bf7-42b5-bee5-1769c34fea15-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.193025 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.195084 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:12:02.193149 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.193190 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-wtmp\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:12:02.193206 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-tls 
podName:20d0fb14-1bf7-42b5-bee5-1769c34fea15 nodeName:}" failed. No retries permitted until 2026-04-19 12:12:02.693185766 +0000 UTC m=+131.447377577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-l4v2s" (UID: "20d0fb14-1bf7-42b5-bee5-1769c34fea15") : secret "kube-state-metrics-tls" not found Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.193461 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-textfile\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.193864 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.193921 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/898b1e53-98cc-4608-bcf9-727c3e285ef4-root\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.194157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/20d0fb14-1bf7-42b5-bee5-1769c34fea15-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.194540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.194686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20d0fb14-1bf7-42b5-bee5-1769c34fea15-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.196066 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.195044 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/898b1e53-98cc-4608-bcf9-727c3e285ef4-metrics-client-ca\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.197425 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.197381 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.198076 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:12:02.198039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.199437 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.199413 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/898b1e53-98cc-4608-bcf9-727c3e285ef4-node-exporter-tls\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.204406 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.204384 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clc2q\" (UniqueName: \"kubernetes.io/projected/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-api-access-clc2q\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.205047 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.204858 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf88r\" (UniqueName: \"kubernetes.io/projected/898b1e53-98cc-4608-bcf9-727c3e285ef4-kube-api-access-xf88r\") pod \"node-exporter-4hzrs\" (UID: \"898b1e53-98cc-4608-bcf9-727c3e285ef4\") " pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.275536 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.275500 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rxnrj" event={"ID":"3a9d31f9-fb41-43e0-9946-0611710438a1","Type":"ContainerStarted","Data":"2b347115008149e2f69e9aa77b7643ba5969cba82c8c5be9b08775dbcaf83068"} Apr 19 
12:12:02.307218 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.307179 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4hzrs" Apr 19 12:12:02.317070 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:02.317046 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod898b1e53_98cc_4608_bcf9_727c3e285ef4.slice/crio-f513ef1103ce09902b78d85fb2f7ce042dee50e6dfe71179489a370a4f99d0bd WatchSource:0}: Error finding container f513ef1103ce09902b78d85fb2f7ce042dee50e6dfe71179489a370a4f99d0bd: Status 404 returned error can't find the container with id f513ef1103ce09902b78d85fb2f7ce042dee50e6dfe71179489a370a4f99d0bd Apr 19 12:12:02.696739 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.696702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.699342 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.699290 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d0fb14-1bf7-42b5-bee5-1769c34fea15-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-l4v2s\" (UID: \"20d0fb14-1bf7-42b5-bee5-1769c34fea15\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:02.900930 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:02.900885 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" Apr 19 12:12:03.065978 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.065892 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-l4v2s"] Apr 19 12:12:03.233446 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:03.233360 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20d0fb14_1bf7_42b5_bee5_1769c34fea15.slice/crio-f53748d38200f8c11821c4ace791175f0229bde18c4b2f0460e00d974c0d2b5d WatchSource:0}: Error finding container f53748d38200f8c11821c4ace791175f0229bde18c4b2f0460e00d974c0d2b5d: Status 404 returned error can't find the container with id f53748d38200f8c11821c4ace791175f0229bde18c4b2f0460e00d974c0d2b5d Apr 19 12:12:03.279907 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.279869 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hzrs" event={"ID":"898b1e53-98cc-4608-bcf9-727c3e285ef4","Type":"ContainerStarted","Data":"f513ef1103ce09902b78d85fb2f7ce042dee50e6dfe71179489a370a4f99d0bd"} Apr 19 12:12:03.281013 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.280971 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" event={"ID":"20d0fb14-1bf7-42b5-bee5-1769c34fea15","Type":"ContainerStarted","Data":"f53748d38200f8c11821c4ace791175f0229bde18c4b2f0460e00d974c0d2b5d"} Apr 19 12:12:03.947301 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.947264 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d8865455f-6w88z"] Apr 19 12:12:03.950794 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.950774 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:03.954215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.953450 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 19 12:12:03.954215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.953672 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 19 12:12:03.954215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.953141 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-h8x7h\"" Apr 19 12:12:03.954215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.953686 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 19 12:12:03.954215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.953672 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-blreie4sgjes0\"" Apr 19 12:12:03.954215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.953850 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 19 12:12:03.954215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.953928 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 19 12:12:03.960355 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:03.960334 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d8865455f-6w88z"] Apr 19 12:12:04.008798 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.008725 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-grpc-tls\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.008798 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.008764 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.008798 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.008782 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.009041 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.008803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.009041 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.008886 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e99e30a-bcfc-4582-b11a-60738ad7760f-metrics-client-ca\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.009041 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.008944 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-tls\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.009041 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.009027 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.009210 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.009048 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q7zr\" (UniqueName: \"kubernetes.io/projected/7e99e30a-bcfc-4582-b11a-60738ad7760f-kube-api-access-5q7zr\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110265 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110215 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110265 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q7zr\" (UniqueName: \"kubernetes.io/projected/7e99e30a-bcfc-4582-b11a-60738ad7760f-kube-api-access-5q7zr\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110500 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110310 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-grpc-tls\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110500 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110606 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: 
\"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110606 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110572 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110707 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110622 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e99e30a-bcfc-4582-b11a-60738ad7760f-metrics-client-ca\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.110707 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.110679 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-tls\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.111477 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.111426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e99e30a-bcfc-4582-b11a-60738ad7760f-metrics-client-ca\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.113581 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.113535 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.113581 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.113535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.113742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.113603 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.113933 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.113908 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-tls\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.114146 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.114103 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-grpc-tls\") pod 
\"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.114326 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.114299 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e99e30a-bcfc-4582-b11a-60738ad7760f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.117760 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.117738 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q7zr\" (UniqueName: \"kubernetes.io/projected/7e99e30a-bcfc-4582-b11a-60738ad7760f-kube-api-access-5q7zr\") pod \"thanos-querier-5d8865455f-6w88z\" (UID: \"7e99e30a-bcfc-4582-b11a-60738ad7760f\") " pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.263130 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.263028 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" Apr 19 12:12:04.286933 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.286892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rxnrj" event={"ID":"3a9d31f9-fb41-43e0-9946-0611710438a1","Type":"ContainerStarted","Data":"7e857a28345077f30f6933c0b53ac0b63fa9ded4bc1df6f4dd4ef57bdef3c4bd"} Apr 19 12:12:04.287090 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.286941 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rxnrj" event={"ID":"3a9d31f9-fb41-43e0-9946-0611710438a1","Type":"ContainerStarted","Data":"0f7b1dcbb5a4e67219dff5c869c6f6b5fb57e4eba592f62f46f1a99538b76009"} Apr 19 12:12:04.288636 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.288606 2567 generic.go:358] "Generic (PLEG): container finished" podID="898b1e53-98cc-4608-bcf9-727c3e285ef4" containerID="3ee00fa9b69cee84d3708edf3a5ec79defc6cbe786fad7743dafa9d0419d539e" exitCode=0 Apr 19 12:12:04.288761 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.288713 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hzrs" event={"ID":"898b1e53-98cc-4608-bcf9-727c3e285ef4","Type":"ContainerDied","Data":"3ee00fa9b69cee84d3708edf3a5ec79defc6cbe786fad7743dafa9d0419d539e"} Apr 19 12:12:04.301462 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.301409 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rxnrj" podStartSLOduration=131.905374044 podStartE2EDuration="2m13.301391105s" podCreationTimestamp="2026-04-19 12:09:51 +0000 UTC" firstStartedPulling="2026-04-19 12:12:02.039167089 +0000 UTC m=+130.793358900" lastFinishedPulling="2026-04-19 12:12:03.435184156 +0000 UTC m=+132.189375961" observedRunningTime="2026-04-19 12:12:04.300102946 +0000 UTC m=+133.054294774" watchObservedRunningTime="2026-04-19 12:12:04.301391105 
+0000 UTC m=+133.055582936" Apr 19 12:12:04.404630 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:04.404605 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d8865455f-6w88z"] Apr 19 12:12:04.408790 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:04.408758 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e99e30a_bcfc_4582_b11a_60738ad7760f.slice/crio-3572de15f04e7c56f56140d13f2f65665457ce2a3e5e610f3947bc7c1ccf9e7e WatchSource:0}: Error finding container 3572de15f04e7c56f56140d13f2f65665457ce2a3e5e610f3947bc7c1ccf9e7e: Status 404 returned error can't find the container with id 3572de15f04e7c56f56140d13f2f65665457ce2a3e5e610f3947bc7c1ccf9e7e Apr 19 12:12:05.292915 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.292876 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" event={"ID":"7e99e30a-bcfc-4582-b11a-60738ad7760f","Type":"ContainerStarted","Data":"3572de15f04e7c56f56140d13f2f65665457ce2a3e5e610f3947bc7c1ccf9e7e"} Apr 19 12:12:05.295191 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.295159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hzrs" event={"ID":"898b1e53-98cc-4608-bcf9-727c3e285ef4","Type":"ContainerStarted","Data":"5d94555d1773b692f1260efe3c528446130ec963826bc0414b3a9314bd3c679c"} Apr 19 12:12:05.295333 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.295195 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4hzrs" event={"ID":"898b1e53-98cc-4608-bcf9-727c3e285ef4","Type":"ContainerStarted","Data":"3420a58d70bd4c99624761c786596c3f35e6eb359c60e64b8a7a61b65288bb4a"} Apr 19 12:12:05.297421 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.297352 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" event={"ID":"20d0fb14-1bf7-42b5-bee5-1769c34fea15","Type":"ContainerStarted","Data":"c39e49b52760e8cef51a42e9968c4e9e4e009ede5d247820675578c5ac33089e"} Apr 19 12:12:05.297421 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.297394 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" event={"ID":"20d0fb14-1bf7-42b5-bee5-1769c34fea15","Type":"ContainerStarted","Data":"2a03b90a893eeb593462cbb19dab13d2f4f6e0c60826bee43abb4f0d9852c55e"} Apr 19 12:12:05.297421 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.297411 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" event={"ID":"20d0fb14-1bf7-42b5-bee5-1769c34fea15","Type":"ContainerStarted","Data":"2c6823d4c18e230b79ae95fa033ade1ff5b36cfe76f46535216fe0cf4e345402"} Apr 19 12:12:05.313491 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.313434 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4hzrs" podStartSLOduration=3.19600545 podStartE2EDuration="4.313422045s" podCreationTimestamp="2026-04-19 12:12:01 +0000 UTC" firstStartedPulling="2026-04-19 12:12:02.319026894 +0000 UTC m=+131.073218703" lastFinishedPulling="2026-04-19 12:12:03.43644349 +0000 UTC m=+132.190635298" observedRunningTime="2026-04-19 12:12:05.311549894 +0000 UTC m=+134.065741721" watchObservedRunningTime="2026-04-19 12:12:05.313422045 +0000 UTC m=+134.067613871" Apr 19 12:12:05.327037 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:05.326980 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-l4v2s" podStartSLOduration=2.709302529 podStartE2EDuration="4.326961142s" podCreationTimestamp="2026-04-19 12:12:01 +0000 UTC" firstStartedPulling="2026-04-19 12:12:03.235513921 +0000 UTC m=+131.989705731" 
lastFinishedPulling="2026-04-19 12:12:04.853172525 +0000 UTC m=+133.607364344" observedRunningTime="2026-04-19 12:12:05.326571539 +0000 UTC m=+134.080763365" watchObservedRunningTime="2026-04-19 12:12:05.326961142 +0000 UTC m=+134.081152972" Apr 19 12:12:06.239914 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.239872 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-77c459b5d9-4q4r2"] Apr 19 12:12:06.242458 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.242429 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.245589 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.245559 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 19 12:12:06.245710 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.245625 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 19 12:12:06.245710 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.245559 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 19 12:12:06.245710 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.245660 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-77290iup1g5t3\"" Apr 19 12:12:06.245876 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.245559 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 19 12:12:06.245876 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.245852 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-5rcwj\"" Apr 19 12:12:06.251319 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.251227 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77c459b5d9-4q4r2"] Apr 19 12:12:06.330376 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.330341 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-client-ca-bundle\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.330808 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.330424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5870950f-5e7a-46c3-a04d-45a969812735-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.330808 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.330467 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqlf\" (UniqueName: \"kubernetes.io/projected/5870950f-5e7a-46c3-a04d-45a969812735-kube-api-access-5jqlf\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.330808 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.330508 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-secret-metrics-server-client-certs\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " 
pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.330808 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.330543 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-secret-metrics-server-tls\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.330808 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.330561 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5870950f-5e7a-46c3-a04d-45a969812735-audit-log\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.330808 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.330610 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5870950f-5e7a-46c3-a04d-45a969812735-metrics-server-audit-profiles\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.431842 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.431799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-client-ca-bundle\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.432097 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.432072 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5870950f-5e7a-46c3-a04d-45a969812735-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.432219 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.432198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqlf\" (UniqueName: \"kubernetes.io/projected/5870950f-5e7a-46c3-a04d-45a969812735-kube-api-access-5jqlf\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.432398 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.432336 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-secret-metrics-server-client-certs\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.432669 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.432647 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-secret-metrics-server-tls\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.432756 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.432686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5870950f-5e7a-46c3-a04d-45a969812735-audit-log\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: 
\"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.432756 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.432714 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5870950f-5e7a-46c3-a04d-45a969812735-metrics-server-audit-profiles\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.432856 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.432797 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5870950f-5e7a-46c3-a04d-45a969812735-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.433052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.433023 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5870950f-5e7a-46c3-a04d-45a969812735-audit-log\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.433736 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.433696 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5870950f-5e7a-46c3-a04d-45a969812735-metrics-server-audit-profiles\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.435005 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.434980 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-client-ca-bundle\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.435355 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.435338 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-secret-metrics-server-tls\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.435406 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.435393 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5870950f-5e7a-46c3-a04d-45a969812735-secret-metrics-server-client-certs\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.439691 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.439668 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqlf\" (UniqueName: \"kubernetes.io/projected/5870950f-5e7a-46c3-a04d-45a969812735-kube-api-access-5jqlf\") pod \"metrics-server-77c459b5d9-4q4r2\" (UID: \"5870950f-5e7a-46c3-a04d-45a969812735\") " pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.554027 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.554005 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:06.679295 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.679257 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77c459b5d9-4q4r2"] Apr 19 12:12:06.682315 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:06.682284 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5870950f_5e7a_46c3_a04d_45a969812735.slice/crio-973eccd912de7207571dc8bcdf8471279e339d591ea9cfcece936818215ecc51 WatchSource:0}: Error finding container 973eccd912de7207571dc8bcdf8471279e339d591ea9cfcece936818215ecc51: Status 404 returned error can't find the container with id 973eccd912de7207571dc8bcdf8471279e339d591ea9cfcece936818215ecc51 Apr 19 12:12:06.712021 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.711994 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94"] Apr 19 12:12:06.715440 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.715378 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" Apr 19 12:12:06.717618 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.717596 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 19 12:12:06.717719 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.717648 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-kgrjp\"" Apr 19 12:12:06.721898 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.721840 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94"] Apr 19 12:12:06.833373 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.833293 2567 scope.go:117] "RemoveContainer" containerID="9858adefffef3312f227b1c09cfbb90a1e5d77627f92302cc427f29f99dbcdbc" Apr 19 12:12:06.835245 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.835221 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/28bf133a-7fde-473d-aa9a-ba2319484941-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sdr94\" (UID: \"28bf133a-7fde-473d-aa9a-ba2319484941\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" Apr 19 12:12:06.935636 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.935605 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/28bf133a-7fde-473d-aa9a-ba2319484941-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sdr94\" (UID: \"28bf133a-7fde-473d-aa9a-ba2319484941\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" Apr 19 12:12:06.938195 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:06.938166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/28bf133a-7fde-473d-aa9a-ba2319484941-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sdr94\" (UID: \"28bf133a-7fde-473d-aa9a-ba2319484941\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" Apr 19 12:12:07.031155 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.031121 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" Apr 19 12:12:07.157242 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.157205 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94"] Apr 19 12:12:07.159922 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:07.159893 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28bf133a_7fde_473d_aa9a_ba2319484941.slice/crio-901e7c324ff49a325cac436fcc04a86bf914e3d7cab4b124af59c036e7d7a69e WatchSource:0}: Error finding container 901e7c324ff49a325cac436fcc04a86bf914e3d7cab4b124af59c036e7d7a69e: Status 404 returned error can't find the container with id 901e7c324ff49a325cac436fcc04a86bf914e3d7cab4b124af59c036e7d7a69e Apr 19 12:12:07.305204 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.305163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" event={"ID":"28bf133a-7fde-473d-aa9a-ba2319484941","Type":"ContainerStarted","Data":"901e7c324ff49a325cac436fcc04a86bf914e3d7cab4b124af59c036e7d7a69e"} Apr 19 12:12:07.307495 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.307455 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" event={"ID":"7e99e30a-bcfc-4582-b11a-60738ad7760f","Type":"ContainerStarted","Data":"a388fe271ea383c107b8c2ef6655375676ad401e06c443bd95b142391542473d"} Apr 19 12:12:07.307650 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:12:07.307511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" event={"ID":"7e99e30a-bcfc-4582-b11a-60738ad7760f","Type":"ContainerStarted","Data":"9dd1dd04812915fc45f71d1a3252be1040e94824b1328ecffa92ef9f086d4d61"} Apr 19 12:12:07.307650 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.307527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" event={"ID":"7e99e30a-bcfc-4582-b11a-60738ad7760f","Type":"ContainerStarted","Data":"978ecedd8d8a965843f17534dbacd183bd7ec8775450296ccc203b06a5af1624"} Apr 19 12:12:07.309258 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.309235 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:12:07.309390 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.309327 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" event={"ID":"88822cf0-d332-4d8f-ab99-d2460f2ad404","Type":"ContainerStarted","Data":"fb5feb021ce2173312a7cb17ca5689114cc8f9a9e969d7c6c62b0cf725696a09"} Apr 19 12:12:07.309791 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.309591 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:12:07.311325 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.311291 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" event={"ID":"5870950f-5e7a-46c3-a04d-45a969812735","Type":"ContainerStarted","Data":"973eccd912de7207571dc8bcdf8471279e339d591ea9cfcece936818215ecc51"} Apr 19 12:12:07.324019 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.323969 2567 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" podStartSLOduration=44.19782528 podStartE2EDuration="46.323953146s" podCreationTimestamp="2026-04-19 12:11:21 +0000 UTC" firstStartedPulling="2026-04-19 12:11:21.839223341 +0000 UTC m=+90.593415153" lastFinishedPulling="2026-04-19 12:11:23.965351214 +0000 UTC m=+92.719543019" observedRunningTime="2026-04-19 12:12:07.323092309 +0000 UTC m=+136.077284146" watchObservedRunningTime="2026-04-19 12:12:07.323953146 +0000 UTC m=+136.078144974" Apr 19 12:12:07.581934 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.581899 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-c8kc9" Apr 19 12:12:07.759258 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.759220 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lfqxb"] Apr 19 12:12:07.761670 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.761645 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lfqxb" Apr 19 12:12:07.764149 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.764099 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 19 12:12:07.764276 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.764146 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9jthf\"" Apr 19 12:12:07.764276 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.764237 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 19 12:12:07.768706 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.768669 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lfqxb"] Apr 19 12:12:07.845714 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.845629 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5m45\" (UniqueName: \"kubernetes.io/projected/dea1fdb1-ab00-4985-b0cb-3078be61f4c5-kube-api-access-k5m45\") pod \"downloads-6bcc868b7-lfqxb\" (UID: \"dea1fdb1-ab00-4985-b0cb-3078be61f4c5\") " pod="openshift-console/downloads-6bcc868b7-lfqxb" Apr 19 12:12:07.946948 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.946907 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5m45\" (UniqueName: \"kubernetes.io/projected/dea1fdb1-ab00-4985-b0cb-3078be61f4c5-kube-api-access-k5m45\") pod \"downloads-6bcc868b7-lfqxb\" (UID: \"dea1fdb1-ab00-4985-b0cb-3078be61f4c5\") " pod="openshift-console/downloads-6bcc868b7-lfqxb" Apr 19 12:12:07.954607 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:07.954574 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5m45\" (UniqueName: 
\"kubernetes.io/projected/dea1fdb1-ab00-4985-b0cb-3078be61f4c5-kube-api-access-k5m45\") pod \"downloads-6bcc868b7-lfqxb\" (UID: \"dea1fdb1-ab00-4985-b0cb-3078be61f4c5\") " pod="openshift-console/downloads-6bcc868b7-lfqxb" Apr 19 12:12:08.074213 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.074160 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lfqxb" Apr 19 12:12:08.126882 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.126773 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 19 12:12:08.130238 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.130213 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:12:08.134780 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.134540 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 19 12:12:08.134780 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.134629 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 19 12:12:08.134780 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.134776 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 19 12:12:08.134992 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.134895 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 19 12:12:08.135057 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135012 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-p7h8h\"" Apr 19 12:12:08.135186 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135164 2567 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 19 12:12:08.135307 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135256 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 19 12:12:08.135307 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135269 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 19 12:12:08.135440 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135388 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 19 12:12:08.135502 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135458 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 19 12:12:08.135502 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135490 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 19 12:12:08.135677 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135659 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 19 12:12:08.135789 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.135767 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e4d1jitc3a6fb\"" Apr 19 12:12:08.139562 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.139539 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 19 12:12:08.155614 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.155584 2567 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 19 12:12:08.250243 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250210 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:12:08.250389 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250255 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-web-config\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:12:08.250389 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:12:08.250389 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250294 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-config\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:12:08.250389 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250311 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250389 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250389 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250368 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250389 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250383 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250410 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250436 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250466 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4r7d\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-kube-api-access-v4r7d\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250495 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.250716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-config-out\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.251031 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250758 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.251031 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250785 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.251031 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.250824 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351358 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351413 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351440 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351467 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4r7d\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-kube-api-access-v4r7d\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351558 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-config-out\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-web-config\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351781 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351812 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-config\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351836 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351877 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.352733 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.353801 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.351954 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.353801 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.352372 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.357004 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.355913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.357004 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.356032 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.357004 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.356381 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.357257 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.357157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.358879 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.358398 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.360996 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.360963 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.361706 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.361666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-config\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.361897 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.361874 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.362018 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.361993 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.362086 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.362001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.362665 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.362486 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.363334 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.363296 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-web-config\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.363334 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.363324 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.364156 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.363821 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-config-out\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.364156 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.363986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.365624 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.365571 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.366330 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.366308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4r7d\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-kube-api-access-v4r7d\") pod \"prometheus-k8s-0\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.378519 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.378395 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lfqxb"]
Apr 19 12:12:08.381907 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:08.381874 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea1fdb1_ab00_4985_b0cb_3078be61f4c5.slice/crio-b837983dbb9c778fa407d2d85c03e4b5ff60c206b35d2524857ec6883d6ef383 WatchSource:0}: Error finding container b837983dbb9c778fa407d2d85c03e4b5ff60c206b35d2524857ec6883d6ef383: Status 404 returned error can't find the container with id b837983dbb9c778fa407d2d85c03e4b5ff60c206b35d2524857ec6883d6ef383
Apr 19 12:12:08.443176 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.443152 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:08.580883 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:08.580855 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:12:08.583811 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:08.583774 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda77860af_dadc_4d15_b10c_7995213c2600.slice/crio-09426331f13f6959cef774519a5bce910f37681b57e2a53e13499dbdb916ea98 WatchSource:0}: Error finding container 09426331f13f6959cef774519a5bce910f37681b57e2a53e13499dbdb916ea98: Status 404 returned error can't find the container with id 09426331f13f6959cef774519a5bce910f37681b57e2a53e13499dbdb916ea98
Apr 19 12:12:09.322610 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.322512 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lfqxb" event={"ID":"dea1fdb1-ab00-4985-b0cb-3078be61f4c5","Type":"ContainerStarted","Data":"b837983dbb9c778fa407d2d85c03e4b5ff60c206b35d2524857ec6883d6ef383"}
Apr 19 12:12:09.323791 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.323761 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerStarted","Data":"09426331f13f6959cef774519a5bce910f37681b57e2a53e13499dbdb916ea98"}
Apr 19 12:12:09.325297 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.325258 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" event={"ID":"28bf133a-7fde-473d-aa9a-ba2319484941","Type":"ContainerStarted","Data":"df50b093aa9eb3964c391cc80589954360c506041f28060bcc60069635128f04"}
Apr 19 12:12:09.325498 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.325480 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94"
Apr 19 12:12:09.329181 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.329148 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" event={"ID":"7e99e30a-bcfc-4582-b11a-60738ad7760f","Type":"ContainerStarted","Data":"8661696d18b9e6a50b223e5f89900b677c4b1892fed80aa1f99e8873814fbcce"}
Apr 19 12:12:09.329301 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.329191 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" event={"ID":"7e99e30a-bcfc-4582-b11a-60738ad7760f","Type":"ContainerStarted","Data":"0e7cd6ee8387de6c2db706a250afc1066f432b4a5daa02d24565c24b5afe58b8"}
Apr 19 12:12:09.329301 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.329206 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" event={"ID":"7e99e30a-bcfc-4582-b11a-60738ad7760f","Type":"ContainerStarted","Data":"f931d91e0f723e839bd3d813617fcd37c8cca5594ee40fc8b2b8cc88c79a42fe"}
Apr 19 12:12:09.329427 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.329328 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z"
Apr 19 12:12:09.331269 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.331236 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" event={"ID":"5870950f-5e7a-46c3-a04d-45a969812735","Type":"ContainerStarted","Data":"655cf48a21626879e03c9fd0ae08490a1be407e9e12e9e852242b2c43a736a19"}
Apr 19 12:12:09.331695 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.331659 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94"
Apr 19 12:12:09.339745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.339629 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sdr94" podStartSLOduration=1.52172728 podStartE2EDuration="3.33961719s" podCreationTimestamp="2026-04-19 12:12:06 +0000 UTC" firstStartedPulling="2026-04-19 12:12:07.161870934 +0000 UTC m=+135.916062739" lastFinishedPulling="2026-04-19 12:12:08.97976084 +0000 UTC m=+137.733952649" observedRunningTime="2026-04-19 12:12:09.338635107 +0000 UTC m=+138.092826935" watchObservedRunningTime="2026-04-19 12:12:09.33961719 +0000 UTC m=+138.093809067"
Apr 19 12:12:09.373871 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.373515 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z" podStartSLOduration=2.549986186 podStartE2EDuration="6.373495494s" podCreationTimestamp="2026-04-19 12:12:03 +0000 UTC" firstStartedPulling="2026-04-19 12:12:04.410850105 +0000 UTC m=+133.165041917" lastFinishedPulling="2026-04-19 12:12:08.23435942 +0000 UTC m=+136.988551225" observedRunningTime="2026-04-19 12:12:09.370207726 +0000 UTC m=+138.124399556" watchObservedRunningTime="2026-04-19 12:12:09.373495494 +0000 UTC m=+138.127687322"
Apr 19 12:12:09.388834 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:09.388772 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" podStartSLOduration=1.809529645 podStartE2EDuration="3.388752329s" podCreationTimestamp="2026-04-19 12:12:06 +0000 UTC" firstStartedPulling="2026-04-19 12:12:06.684289719 +0000 UTC m=+135.438481524" lastFinishedPulling="2026-04-19 12:12:08.263512391 +0000 UTC m=+137.017704208" observedRunningTime="2026-04-19 12:12:09.387581775 +0000 UTC m=+138.141773627" watchObservedRunningTime="2026-04-19 12:12:09.388752329 +0000 UTC m=+138.142944159"
Apr 19 12:12:10.340432 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:10.340393 2567 generic.go:358] "Generic (PLEG): container finished" podID="a77860af-dadc-4d15-b10c-7995213c2600" containerID="91a1ba43464846b79d6ef3691a2b85a728aacdace5722e24244963d4e89efca0" exitCode=0
Apr 19 12:12:10.340896 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:10.340522 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"91a1ba43464846b79d6ef3691a2b85a728aacdace5722e24244963d4e89efca0"}
Apr 19 12:12:14.358456 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:14.358286 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerStarted","Data":"35101c2a8b203888c3388ec00c4fb683b41471bf06a14673022266375de5d668"}
Apr 19 12:12:14.358456 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:14.358409 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerStarted","Data":"ba6e2c544a621988627e6daaf3ac4d56807d49e05dd16cf45cd5989ea2d4c5a2"}
Apr 19 12:12:14.358910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:14.358422 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerStarted","Data":"fae900886f966f530d32a1d65b00f1f2a7e156c0acb15899a4ac261e983e4b32"}
Apr 19 12:12:14.358910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:14.358542 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerStarted","Data":"9778e4e5777f05f8e47485371fbb6682670961c42aafc10093e370bc8a1f305b"}
Apr 19 12:12:14.358910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:14.358558 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerStarted","Data":"0573a0af300bc1b54365b5c96483be8e0e196b00a13a63b4a575c57f4fced981"}
Apr 19 12:12:14.358910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:14.358573 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerStarted","Data":"8ee496506528517f29b6b00d8320d6426364f3758cecd47625da8216c68d3954"}
Apr 19 12:12:14.398594 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:14.398522 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.198692119 podStartE2EDuration="6.398500877s" podCreationTimestamp="2026-04-19 12:12:08 +0000 UTC" firstStartedPulling="2026-04-19 12:12:08.586162146 +0000 UTC m=+137.340353958" lastFinishedPulling="2026-04-19 12:12:13.7859709 +0000 UTC m=+142.540162716" observedRunningTime="2026-04-19 12:12:14.396519417 +0000 UTC m=+143.150711276" watchObservedRunningTime="2026-04-19 12:12:14.398500877 +0000 UTC m=+143.152692705"
Apr 19 12:12:15.348323 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:15.348289 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d8865455f-6w88z"
Apr 19 12:12:18.443463 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:18.443420 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:12:26.398644 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:26.398601 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lfqxb" event={"ID":"dea1fdb1-ab00-4985-b0cb-3078be61f4c5","Type":"ContainerStarted","Data":"f3f48a708624e5753539689a04c96eccbdbb9d699e8ad6a473d1a97908506fd5"}
Apr 19 12:12:26.399023 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:26.398798 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lfqxb"
Apr 19 12:12:26.409449 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:26.409417 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lfqxb"
Apr 19 12:12:26.415462 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:26.415408 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lfqxb" podStartSLOduration=2.317854812 podStartE2EDuration="19.415391517s" podCreationTimestamp="2026-04-19 12:12:07 +0000 UTC" firstStartedPulling="2026-04-19 12:12:08.387308471 +0000 UTC m=+137.141500280" lastFinishedPulling="2026-04-19 12:12:25.484845166 +0000 UTC m=+154.239036985" observedRunningTime="2026-04-19 12:12:26.41407591 +0000 UTC m=+155.168267733" watchObservedRunningTime="2026-04-19 12:12:26.415391517 +0000 UTC m=+155.169583345"
Apr 19 12:12:26.555007 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:26.554958 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2"
Apr 19 12:12:26.555194 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:26.555020 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2"
Apr 19 12:12:27.108752 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:12:27.108704 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7jlhs" podUID="bb22769b-c18a-471e-9118-2fca21dc6606"
Apr 19 12:12:27.114997 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:12:27.114939 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l5qnw" podUID="0cdd2a15-9ab0-45f9-9373-938372482e1a"
Apr 19 12:12:27.402519 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:27.402433 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:12:27.402519 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:27.402456 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:12:31.982085 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:31.982041 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:12:31.982686 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:31.982135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:12:31.984781 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:31.984753 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bb22769b-c18a-471e-9118-2fca21dc6606-metrics-tls\") pod \"dns-default-7jlhs\" (UID: \"bb22769b-c18a-471e-9118-2fca21dc6606\") " pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:12:31.985021 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:31.985001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cdd2a15-9ab0-45f9-9373-938372482e1a-cert\") pod \"ingress-canary-l5qnw\" (UID: \"0cdd2a15-9ab0-45f9-9373-938372482e1a\") " pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:12:32.206080 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.206040 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7hvqw\""
Apr 19 12:12:32.206622 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.206594 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zfmpv\""
Apr 19 12:12:32.214505 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.214474 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:12:32.214663 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.214549 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l5qnw"
Apr 19 12:12:32.370162 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.369805 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7jlhs"]
Apr 19 12:12:32.372925 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:32.372647 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb22769b_c18a_471e_9118_2fca21dc6606.slice/crio-0e99c5536fa05108000a81f8c7f62fd34dd6c14af5602fc48e2daa745d074c34 WatchSource:0}: Error finding container 0e99c5536fa05108000a81f8c7f62fd34dd6c14af5602fc48e2daa745d074c34: Status 404 returned error can't find the container with id 0e99c5536fa05108000a81f8c7f62fd34dd6c14af5602fc48e2daa745d074c34
Apr 19 12:12:32.389025 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.388874 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l5qnw"]
Apr 19 12:12:32.392068 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:12:32.392029 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cdd2a15_9ab0_45f9_9373_938372482e1a.slice/crio-c4c055ef69dbb8952bc6668b87df0c3835b08fd6079d660f0bce131b28618cc0 WatchSource:0}: Error finding container c4c055ef69dbb8952bc6668b87df0c3835b08fd6079d660f0bce131b28618cc0: Status 404 returned error can't find the container with id c4c055ef69dbb8952bc6668b87df0c3835b08fd6079d660f0bce131b28618cc0
Apr 19 12:12:32.420415 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.420376 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7jlhs" event={"ID":"bb22769b-c18a-471e-9118-2fca21dc6606","Type":"ContainerStarted","Data":"0e99c5536fa05108000a81f8c7f62fd34dd6c14af5602fc48e2daa745d074c34"}
Apr 19 12:12:32.421646 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:32.421615 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l5qnw" event={"ID":"0cdd2a15-9ab0-45f9-9373-938372482e1a","Type":"ContainerStarted","Data":"c4c055ef69dbb8952bc6668b87df0c3835b08fd6079d660f0bce131b28618cc0"}
Apr 19 12:12:35.432691 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:35.432657 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l5qnw" event={"ID":"0cdd2a15-9ab0-45f9-9373-938372482e1a","Type":"ContainerStarted","Data":"af456c66be5deeeeff6cbb5f4c1eb0b9053f2e86f4a15740399b9c8566b53e80"}
Apr 19 12:12:35.434396 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:35.434371 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7jlhs" event={"ID":"bb22769b-c18a-471e-9118-2fca21dc6606","Type":"ContainerStarted","Data":"424227a746d911bc9fa325fddc2d21d6e090e6fb312430ef200d38f9f932fa57"}
Apr 19 12:12:35.434396 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:35.434401 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7jlhs" event={"ID":"bb22769b-c18a-471e-9118-2fca21dc6606","Type":"ContainerStarted","Data":"78ae0e91f76b5538ab26a77c7aebd7a9078fb4754112477c7d5c1fbb8c182ee3"}
Apr 19 12:12:35.434562 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:35.434499 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:12:35.445379 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:35.445326 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l5qnw" podStartSLOduration=128.844751022 podStartE2EDuration="2m11.445310103s" podCreationTimestamp="2026-04-19 12:10:24 +0000 UTC" firstStartedPulling="2026-04-19 12:12:32.39429991 +0000 UTC m=+161.148491716" lastFinishedPulling="2026-04-19 12:12:34.994858991 +0000 UTC m=+163.749050797" observedRunningTime="2026-04-19 12:12:35.444921042 +0000 UTC m=+164.199112869" watchObservedRunningTime="2026-04-19 12:12:35.445310103 +0000 UTC m=+164.199501922"
Apr 19 12:12:35.461518 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:35.461461 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7jlhs" podStartSLOduration=128.846348419 podStartE2EDuration="2m11.46144792s" podCreationTimestamp="2026-04-19 12:10:24 +0000 UTC" firstStartedPulling="2026-04-19 12:12:32.374949981 +0000 UTC m=+161.129141798" lastFinishedPulling="2026-04-19 12:12:34.990049491 +0000 UTC m=+163.744241299" observedRunningTime="2026-04-19 12:12:35.459615571 +0000 UTC m=+164.213807398" watchObservedRunningTime="2026-04-19 12:12:35.46144792 +0000 UTC m=+164.215639754"
Apr 19 12:12:45.440507 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:45.440467 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7jlhs"
Apr 19 12:12:46.559676 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:46.559640 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:46.563509 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:46.563487 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-77c459b5d9-4q4r2" Apr 19 12:12:49.480372 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:49.480287 2567 generic.go:358] "Generic (PLEG): container finished" podID="0ff01778-4b50-4a8a-ab4e-abf54e99b970" containerID="16da9d22ff6d5c44228cc72b110cc586b33dc1571562544b38630a7a4a413d5d" exitCode=0 Apr 19 12:12:49.480372 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:49.480362 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck" event={"ID":"0ff01778-4b50-4a8a-ab4e-abf54e99b970","Type":"ContainerDied","Data":"16da9d22ff6d5c44228cc72b110cc586b33dc1571562544b38630a7a4a413d5d"} Apr 19 12:12:49.480789 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:49.480710 2567 scope.go:117] "RemoveContainer" containerID="16da9d22ff6d5c44228cc72b110cc586b33dc1571562544b38630a7a4a413d5d" Apr 19 12:12:49.481864 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:49.481838 2567 generic.go:358] "Generic (PLEG): container finished" podID="a80fa3ab-60ec-4151-bfb0-4bc0ed470f79" containerID="69ff75f7c77e6da02208b8e9d3d69820ade4b764d1a5b12002983165f475373c" exitCode=0 Apr 19 12:12:49.481944 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:49.481900 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b5csj" event={"ID":"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79","Type":"ContainerDied","Data":"69ff75f7c77e6da02208b8e9d3d69820ade4b764d1a5b12002983165f475373c"} Apr 19 12:12:49.482208 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:49.482194 2567 scope.go:117] "RemoveContainer" containerID="69ff75f7c77e6da02208b8e9d3d69820ade4b764d1a5b12002983165f475373c" Apr 19 12:12:50.486063 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:50.486016 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-xhcck" event={"ID":"0ff01778-4b50-4a8a-ab4e-abf54e99b970","Type":"ContainerStarted","Data":"0bea3e0c8513fa994b00e60091b12a8d15a3cfdc0b091df3d2b3b12f8d9b4716"} Apr 19 12:12:50.487532 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:50.487508 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b5csj" event={"ID":"a80fa3ab-60ec-4151-bfb0-4bc0ed470f79","Type":"ContainerStarted","Data":"88f79a1b5d8459f01db9335e19cfeab42e9582ec0fe4c9cb4a9718de8de0364d"} Apr 19 12:12:50.591480 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:50.591451 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7jlhs_bb22769b-c18a-471e-9118-2fca21dc6606/dns/0.log" Apr 19 12:12:50.791881 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:50.791810 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7jlhs_bb22769b-c18a-471e-9118-2fca21dc6606/kube-rbac-proxy/0.log" Apr 19 12:12:51.391608 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:51.391578 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8gjrs_8cd0f946-6502-4d2a-94d4-721582219a2f/dns-node-resolver/0.log" Apr 19 12:12:52.592198 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:12:52.592168 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l5qnw_0cdd2a15-9ab0-45f9-9373-938372482e1a/serve-healthcheck-canary/0.log" Apr 19 12:13:08.444188 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:08.444143 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:13:08.507324 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:08.507299 2567 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:13:08.560334 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:08.560309 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:13:26.446271 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.446232 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 19 12:13:26.446784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.446649 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="prometheus" containerID="cri-o://8ee496506528517f29b6b00d8320d6426364f3758cecd47625da8216c68d3954" gracePeriod=600 Apr 19 12:13:26.446784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.446692 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-thanos" containerID="cri-o://35101c2a8b203888c3388ec00c4fb683b41471bf06a14673022266375de5d668" gracePeriod=600 Apr 19 12:13:26.446784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.446715 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="config-reloader" containerID="cri-o://0573a0af300bc1b54365b5c96483be8e0e196b00a13a63b4a575c57f4fced981" gracePeriod=600 Apr 19 12:13:26.446784 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.446715 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-web" containerID="cri-o://fae900886f966f530d32a1d65b00f1f2a7e156c0acb15899a4ac261e983e4b32" 
gracePeriod=600 Apr 19 12:13:26.447002 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.446692 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="thanos-sidecar" containerID="cri-o://9778e4e5777f05f8e47485371fbb6682670961c42aafc10093e370bc8a1f305b" gracePeriod=600 Apr 19 12:13:26.447002 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.446659 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy" containerID="cri-o://ba6e2c544a621988627e6daaf3ac4d56807d49e05dd16cf45cd5989ea2d4c5a2" gracePeriod=600 Apr 19 12:13:26.604186 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604156 2567 generic.go:358] "Generic (PLEG): container finished" podID="a77860af-dadc-4d15-b10c-7995213c2600" containerID="35101c2a8b203888c3388ec00c4fb683b41471bf06a14673022266375de5d668" exitCode=0 Apr 19 12:13:26.604186 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604182 2567 generic.go:358] "Generic (PLEG): container finished" podID="a77860af-dadc-4d15-b10c-7995213c2600" containerID="ba6e2c544a621988627e6daaf3ac4d56807d49e05dd16cf45cd5989ea2d4c5a2" exitCode=0 Apr 19 12:13:26.604186 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604188 2567 generic.go:358] "Generic (PLEG): container finished" podID="a77860af-dadc-4d15-b10c-7995213c2600" containerID="9778e4e5777f05f8e47485371fbb6682670961c42aafc10093e370bc8a1f305b" exitCode=0 Apr 19 12:13:26.604186 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604193 2567 generic.go:358] "Generic (PLEG): container finished" podID="a77860af-dadc-4d15-b10c-7995213c2600" containerID="0573a0af300bc1b54365b5c96483be8e0e196b00a13a63b4a575c57f4fced981" exitCode=0 Apr 19 12:13:26.604423 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604199 2567 generic.go:358] "Generic (PLEG): 
container finished" podID="a77860af-dadc-4d15-b10c-7995213c2600" containerID="8ee496506528517f29b6b00d8320d6426364f3758cecd47625da8216c68d3954" exitCode=0 Apr 19 12:13:26.604423 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604223 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"35101c2a8b203888c3388ec00c4fb683b41471bf06a14673022266375de5d668"} Apr 19 12:13:26.604423 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604256 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"ba6e2c544a621988627e6daaf3ac4d56807d49e05dd16cf45cd5989ea2d4c5a2"} Apr 19 12:13:26.604423 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604267 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"9778e4e5777f05f8e47485371fbb6682670961c42aafc10093e370bc8a1f305b"} Apr 19 12:13:26.604423 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"0573a0af300bc1b54365b5c96483be8e0e196b00a13a63b4a575c57f4fced981"} Apr 19 12:13:26.604423 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:26.604283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"8ee496506528517f29b6b00d8320d6426364f3758cecd47625da8216c68d3954"} Apr 19 12:13:27.609878 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.609844 2567 generic.go:358] "Generic (PLEG): container finished" podID="a77860af-dadc-4d15-b10c-7995213c2600" 
containerID="fae900886f966f530d32a1d65b00f1f2a7e156c0acb15899a4ac261e983e4b32" exitCode=0 Apr 19 12:13:27.610250 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.609917 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"fae900886f966f530d32a1d65b00f1f2a7e156c0acb15899a4ac261e983e4b32"} Apr 19 12:13:27.694955 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.694931 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:13:27.777898 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777818 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-metrics-client-ca\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.777898 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777860 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-config-out\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.777898 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777880 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-thanos-prometheus-http-client-file\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.777898 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777898 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-web-config\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777912 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-db\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777931 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-tls-assets\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777959 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-kubelet-serving-ca-bundle\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.777981 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-trusted-ca-bundle\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778015 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778049 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-kube-rbac-proxy\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778078 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-config\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778103 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-serving-certs-ca-bundle\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778161 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778187 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-metrics-client-certs\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778211 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-grpc-tls\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778264 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778253 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-tls\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778858 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778302 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4r7d\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-kube-api-access-v4r7d\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778858 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778330 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-rulefiles-0\") pod \"a77860af-dadc-4d15-b10c-7995213c2600\" (UID: \"a77860af-dadc-4d15-b10c-7995213c2600\") " Apr 19 12:13:27.778858 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778322 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-metrics-client-ca" 
(OuterVolumeSpecName: "configmap-metrics-client-ca") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:27.778858 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778469 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:27.779064 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.778968 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:27.779364 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.779331 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:13:27.780035 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.779783 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:27.781448 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.781412 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:27.782175 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.782140 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-config-out" (OuterVolumeSpecName: "config-out") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:13:27.782270 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.782240 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.782324 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.782303 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.783044 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783019 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.783044 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783025 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:27.783200 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783096 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.783314 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783294 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-kube-api-access-v4r7d" (OuterVolumeSpecName: "kube-api-access-v4r7d") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "kube-api-access-v4r7d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:27.783446 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783424 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.783667 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783634 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-config" (OuterVolumeSpecName: "config") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.783764 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783696 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.783764 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.783739 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.792755 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.792728 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-web-config" (OuterVolumeSpecName: "web-config") pod "a77860af-dadc-4d15-b10c-7995213c2600" (UID: "a77860af-dadc-4d15-b10c-7995213c2600"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:27.879311 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879274 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:13:27.879311 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879302 2567 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-metrics-client-certs\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:13:27.879311 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879313 2567 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-grpc-tls\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 
12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879322 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879332 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v4r7d\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-kube-api-access-v4r7d\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879340 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879349 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-metrics-client-ca\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879358 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-config-out\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879366 2567 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879374 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-web-config\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879384 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-k8s-db\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879393 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a77860af-dadc-4d15-b10c-7995213c2600-tls-assets\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879402 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879410 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879421 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879429 2567 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-secret-kube-rbac-proxy\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879437 2567 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a77860af-dadc-4d15-b10c-7995213c2600-config\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:27.879501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:27.879446 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a77860af-dadc-4d15-b10c-7995213c2600-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\""
Apr 19 12:13:28.615223 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.615183 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a77860af-dadc-4d15-b10c-7995213c2600","Type":"ContainerDied","Data":"09426331f13f6959cef774519a5bce910f37681b57e2a53e13499dbdb916ea98"}
Apr 19 12:13:28.615223 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.615229 2567 scope.go:117] "RemoveContainer" containerID="35101c2a8b203888c3388ec00c4fb683b41471bf06a14673022266375de5d668"
Apr 19 12:13:28.615725 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.615262 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.622670 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.622649 2567 scope.go:117] "RemoveContainer" containerID="ba6e2c544a621988627e6daaf3ac4d56807d49e05dd16cf45cd5989ea2d4c5a2"
Apr 19 12:13:28.629283 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.629261 2567 scope.go:117] "RemoveContainer" containerID="fae900886f966f530d32a1d65b00f1f2a7e156c0acb15899a4ac261e983e4b32"
Apr 19 12:13:28.635246 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.635223 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:13:28.636872 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.636854 2567 scope.go:117] "RemoveContainer" containerID="9778e4e5777f05f8e47485371fbb6682670961c42aafc10093e370bc8a1f305b"
Apr 19 12:13:28.639542 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.639522 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:13:28.643753 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.643736 2567 scope.go:117] "RemoveContainer" containerID="0573a0af300bc1b54365b5c96483be8e0e196b00a13a63b4a575c57f4fced981"
Apr 19 12:13:28.650324 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.650307 2567 scope.go:117] "RemoveContainer" containerID="8ee496506528517f29b6b00d8320d6426364f3758cecd47625da8216c68d3954"
Apr 19 12:13:28.657329 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.657312 2567 scope.go:117] "RemoveContainer" containerID="91a1ba43464846b79d6ef3691a2b85a728aacdace5722e24244963d4e89efca0"
Apr 19 12:13:28.667398 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667377 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:13:28.667682 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667671 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-web"
Apr 19 12:13:28.667719 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667685 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-web"
Apr 19 12:13:28.667719 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667695 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy"
Apr 19 12:13:28.667719 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667701 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy"
Apr 19 12:13:28.667719 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667711 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="thanos-sidecar"
Apr 19 12:13:28.667719 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667717 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="thanos-sidecar"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667725 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="prometheus"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667730 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="prometheus"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667739 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="init-config-reloader"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667744 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="init-config-reloader"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667751 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="config-reloader"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667756 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="config-reloader"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667763 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-thanos"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667769 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-thanos"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667823 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667834 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-web"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667840 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="thanos-sidecar"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667848 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="config-reloader"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667855 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="prometheus"
Apr 19 12:13:28.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.667862 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a77860af-dadc-4d15-b10c-7995213c2600" containerName="kube-rbac-proxy-thanos"
Apr 19 12:13:28.673285 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.673270 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.675567 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.675533 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 19 12:13:28.675715 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.675681 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 19 12:13:28.675849 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.675763 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 19 12:13:28.675849 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.675777 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 19 12:13:28.676193 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676176 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 19 12:13:28.676455 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676437 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 19 12:13:28.676571 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676494 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e4d1jitc3a6fb\""
Apr 19 12:13:28.676571 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676517 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-p7h8h\""
Apr 19 12:13:28.676655 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676580 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 19 12:13:28.676730 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676716 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 19 12:13:28.676926 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676747 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 19 12:13:28.676926 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.676748 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 19 12:13:28.678812 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.678790 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 19 12:13:28.680413 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.680189 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 19 12:13:28.683461 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.683441 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:13:28.786697 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786654 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8ls\" (UniqueName: \"kubernetes.io/projected/31bab116-4755-4146-bc5c-30aebd8a4641-kube-api-access-hv8ls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.786697 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.786949 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786713 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.786949 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786786 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-config\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.786949 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786833 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.786949 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.786949 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786897 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.786949 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786941 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.786976 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31bab116-4755-4146-bc5c-30aebd8a4641-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787002 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787031 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-web-config\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787090 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787160 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31bab116-4755-4146-bc5c-30aebd8a4641-config-out\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787273 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787241 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.787533 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.787282 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888527 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888527 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31bab116-4755-4146-bc5c-30aebd8a4641-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888527 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888580 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-web-config\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888609 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888653 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31bab116-4755-4146-bc5c-30aebd8a4641-config-out\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888721 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.888814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8ls\" (UniqueName: \"kubernetes.io/projected/31bab116-4755-4146-bc5c-30aebd8a4641-kube-api-access-hv8ls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-config\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.888972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.889008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.889046 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889584 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.889470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.889584 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.889472 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.891677 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.891647 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31bab116-4755-4146-bc5c-30aebd8a4641-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.891810 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.891710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.891978 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.891950 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.892084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.892040 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.892215 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.892192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.892484 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.892366 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.892484 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.892374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31bab116-4755-4146-bc5c-30aebd8a4641-config-out\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.892484 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.892395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.892681 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.892615 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-web-config\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.892771 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.892748 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.893561 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.893535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.894298 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.894277 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.894989 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.894966 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.895071 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.895051 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/31bab116-4755-4146-bc5c-30aebd8a4641-config\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.895258 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.895243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/31bab116-4755-4146-bc5c-30aebd8a4641-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.899508 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.899483 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8ls\" (UniqueName: \"kubernetes.io/projected/31bab116-4755-4146-bc5c-30aebd8a4641-kube-api-access-hv8ls\") pod \"prometheus-k8s-0\" (UID: \"31bab116-4755-4146-bc5c-30aebd8a4641\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:28.983737 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:28.983681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 19 12:13:29.109682 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:29.109652 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 19 12:13:29.113660 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:13:29.113627 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31bab116_4755_4146_bc5c_30aebd8a4641.slice/crio-fc81518ad31f05757da2fa8930cf93e160f76bfbdc4378559a68b2a6619f2c56 WatchSource:0}: Error finding container fc81518ad31f05757da2fa8930cf93e160f76bfbdc4378559a68b2a6619f2c56: Status 404 returned error can't find the container with id fc81518ad31f05757da2fa8930cf93e160f76bfbdc4378559a68b2a6619f2c56
Apr 19 12:13:29.625165 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:29.625129 2567 generic.go:358] "Generic (PLEG): container finished" podID="31bab116-4755-4146-bc5c-30aebd8a4641" containerID="d92be08890cd530017f4cf3d84b12323658b1f3a5f27f2213d1b01cf9e4cf7d8" exitCode=0
Apr 19 12:13:29.625588 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:29.625171 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0"
event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerDied","Data":"d92be08890cd530017f4cf3d84b12323658b1f3a5f27f2213d1b01cf9e4cf7d8"} Apr 19 12:13:29.625588 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:29.625195 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerStarted","Data":"fc81518ad31f05757da2fa8930cf93e160f76bfbdc4378559a68b2a6619f2c56"} Apr 19 12:13:29.840843 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:29.840797 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a77860af-dadc-4d15-b10c-7995213c2600" path="/var/lib/kubelet/pods/a77860af-dadc-4d15-b10c-7995213c2600/volumes" Apr 19 12:13:30.630771 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:30.630732 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerStarted","Data":"9af7525b0016b3121005c43bf32a6f89fc2f6d9c8c8a6de1b5686320d2136de1"} Apr 19 12:13:30.630771 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:30.630770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerStarted","Data":"4b6dc11e457a856b42f21daa38175d259a4411ffc8b4ccc444100c8c284036ca"} Apr 19 12:13:30.630771 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:30.630780 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerStarted","Data":"15b73ed4226550e06647597ea6029bef13e397cbd86624b65758620633f05471"} Apr 19 12:13:30.631213 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:30.630788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerStarted","Data":"17f9425e3cafad6c9f59cd45f74ca2f78c4d1db6ac4f253d0033c5106e44ecb6"} Apr 19 12:13:30.631213 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:30.630796 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerStarted","Data":"20a10d18732f230522757b48c6c12cce8d06693c81339229b5bdb6b8f8aa5e24"} Apr 19 12:13:30.631213 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:30.630806 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"31bab116-4755-4146-bc5c-30aebd8a4641","Type":"ContainerStarted","Data":"6cf0f93acd6d9adf0dc9c4cf389c716ecc79df1d107c97a40d489b60ccacacb0"} Apr 19 12:13:30.656652 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:30.656591 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.656571403 podStartE2EDuration="2.656571403s" podCreationTimestamp="2026-04-19 12:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:13:30.654345002 +0000 UTC m=+219.408536829" watchObservedRunningTime="2026-04-19 12:13:30.656571403 +0000 UTC m=+219.410763235" Apr 19 12:13:33.983990 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:13:33.983954 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:14:09.043909 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.043825 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tplkv"] Apr 19 12:14:09.046755 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.046734 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.048965 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.048940 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 19 12:14:09.060418 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.055034 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tplkv"] Apr 19 12:14:09.138607 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.138569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-original-pull-secret\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.138798 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.138628 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-dbus\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.138798 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.138711 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-kubelet-config\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.239491 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.239443 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-dbus\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.239688 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.239534 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-kubelet-config\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.239688 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.239578 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-original-pull-secret\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.239688 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.239630 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-dbus\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.239688 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.239655 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-kubelet-config\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.241885 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.241861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/598b50ba-3a7a-4f1d-8e80-a9fb189ba28a-original-pull-secret\") pod \"global-pull-secret-syncer-tplkv\" (UID: \"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a\") " pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.363647 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.363566 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tplkv" Apr 19 12:14:09.479294 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.479258 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tplkv"] Apr 19 12:14:09.482190 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:14:09.482165 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598b50ba_3a7a_4f1d_8e80_a9fb189ba28a.slice/crio-bfcb0a3d4b36d66ed77ad9d8e2825cfd2a5636f93caa8ecd2d56ee8062cd1e31 WatchSource:0}: Error finding container bfcb0a3d4b36d66ed77ad9d8e2825cfd2a5636f93caa8ecd2d56ee8062cd1e31: Status 404 returned error can't find the container with id bfcb0a3d4b36d66ed77ad9d8e2825cfd2a5636f93caa8ecd2d56ee8062cd1e31 Apr 19 12:14:09.751454 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:09.751418 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tplkv" event={"ID":"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a","Type":"ContainerStarted","Data":"bfcb0a3d4b36d66ed77ad9d8e2825cfd2a5636f93caa8ecd2d56ee8062cd1e31"} Apr 19 12:14:13.765289 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:13.765252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tplkv" event={"ID":"598b50ba-3a7a-4f1d-8e80-a9fb189ba28a","Type":"ContainerStarted","Data":"a13d52733027b3a1dcbb48b3f490317ad190f8e99746d34705ddd4e1c613c71a"} Apr 19 12:14:13.777936 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:13.777877 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tplkv" podStartSLOduration=0.855745745 podStartE2EDuration="4.777849541s" podCreationTimestamp="2026-04-19 12:14:09 +0000 UTC" firstStartedPulling="2026-04-19 12:14:09.483707298 +0000 UTC m=+258.237899105" lastFinishedPulling="2026-04-19 12:14:13.405811096 +0000 UTC m=+262.160002901" observedRunningTime="2026-04-19 12:14:13.777688334 +0000 UTC m=+262.531880162" watchObservedRunningTime="2026-04-19 12:14:13.777849541 +0000 UTC m=+262.532041368" Apr 19 12:14:28.983853 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:28.983809 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:14:28.999033 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:28.999006 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:14:29.827339 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:29.827314 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 19 12:14:51.749756 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:51.749719 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:14:51.750312 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:51.749886 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:14:51.754416 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:51.754396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:14:51.754541 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:14:51.754455 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:14:51.761422 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:14:51.761403 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 12:15:38.368429 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.368343 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54"] Apr 19 12:15:38.371614 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.371596 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.374946 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.374923 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 19 12:15:38.375222 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.375197 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 19 12:15:38.375367 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.375350 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 19 12:15:38.375407 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.375389 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-66xtk\"" Apr 19 12:15:38.376444 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.376427 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 19 12:15:38.388256 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.388231 2567 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54"] Apr 19 12:15:38.476496 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.476462 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3d0a41c-19e8-426a-a855-03cf0363ad5a-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.476672 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.476529 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3d0a41c-19e8-426a-a855-03cf0363ad5a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.476672 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.476565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhg9\" (UniqueName: \"kubernetes.io/projected/b3d0a41c-19e8-426a-a855-03cf0363ad5a-kube-api-access-gqhg9\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.577783 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.577733 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3d0a41c-19e8-426a-a855-03cf0363ad5a-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " 
pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.577961 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.577809 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3d0a41c-19e8-426a-a855-03cf0363ad5a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.577961 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.577837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhg9\" (UniqueName: \"kubernetes.io/projected/b3d0a41c-19e8-426a-a855-03cf0363ad5a-kube-api-access-gqhg9\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.580321 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.580294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3d0a41c-19e8-426a-a855-03cf0363ad5a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.580445 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.580334 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3d0a41c-19e8-426a-a855-03cf0363ad5a-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.588330 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:15:38.588304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhg9\" (UniqueName: \"kubernetes.io/projected/b3d0a41c-19e8-426a-a855-03cf0363ad5a-kube-api-access-gqhg9\") pod \"opendatahub-operator-controller-manager-676bcb86f4-nxd54\" (UID: \"b3d0a41c-19e8-426a-a855-03cf0363ad5a\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.681724 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.681691 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:38.808350 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.808323 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54"] Apr 19 12:15:38.811170 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:15:38.811142 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d0a41c_19e8_426a_a855_03cf0363ad5a.slice/crio-79025aea85de85649a7e8786715cee489777c2d421832572e8cca3164d904ab1 WatchSource:0}: Error finding container 79025aea85de85649a7e8786715cee489777c2d421832572e8cca3164d904ab1: Status 404 returned error can't find the container with id 79025aea85de85649a7e8786715cee489777c2d421832572e8cca3164d904ab1 Apr 19 12:15:38.812759 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:38.812742 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:15:39.028641 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:39.028560 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" event={"ID":"b3d0a41c-19e8-426a-a855-03cf0363ad5a","Type":"ContainerStarted","Data":"79025aea85de85649a7e8786715cee489777c2d421832572e8cca3164d904ab1"} Apr 19 12:15:42.040827 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:15:42.040788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" event={"ID":"b3d0a41c-19e8-426a-a855-03cf0363ad5a","Type":"ContainerStarted","Data":"25f1cab325778e17e1575304d3816096f6f2d07fba3fc075aee9b785c707d99c"} Apr 19 12:15:42.041199 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:42.041027 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:42.058151 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:42.058068 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" podStartSLOduration=1.244593563 podStartE2EDuration="4.058052953s" podCreationTimestamp="2026-04-19 12:15:38 +0000 UTC" firstStartedPulling="2026-04-19 12:15:38.812868052 +0000 UTC m=+347.567059860" lastFinishedPulling="2026-04-19 12:15:41.626327445 +0000 UTC m=+350.380519250" observedRunningTime="2026-04-19 12:15:42.056709068 +0000 UTC m=+350.810900894" watchObservedRunningTime="2026-04-19 12:15:42.058052953 +0000 UTC m=+350.812244781" Apr 19 12:15:53.046514 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:53.046481 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-nxd54" Apr 19 12:15:59.855560 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.855521 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-688fc496d-8v748"] Apr 19 12:15:59.859213 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.859190 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.861637 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.861610 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:15:59.861762 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.861642 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 19 12:15:59.862481 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.862462 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 19 12:15:59.862543 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.862527 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 19 12:15:59.862601 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.862590 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 19 12:15:59.862649 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.862631 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-vtqjc\"" Apr 19 12:15:59.865923 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.865899 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d09d25b-d17a-4e56-8370-75c92ceb73de-cert\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.866014 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.865952 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-28sxg\" (UniqueName: \"kubernetes.io/projected/5d09d25b-d17a-4e56-8370-75c92ceb73de-kube-api-access-28sxg\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.866014 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.865986 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d09d25b-d17a-4e56-8370-75c92ceb73de-metrics-cert\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.866094 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.866029 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5d09d25b-d17a-4e56-8370-75c92ceb73de-manager-config\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.868591 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.868571 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-688fc496d-8v748"] Apr 19 12:15:59.967302 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.966622 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28sxg\" (UniqueName: \"kubernetes.io/projected/5d09d25b-d17a-4e56-8370-75c92ceb73de-kube-api-access-28sxg\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.967302 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.966681 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d09d25b-d17a-4e56-8370-75c92ceb73de-metrics-cert\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.967302 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.966720 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5d09d25b-d17a-4e56-8370-75c92ceb73de-manager-config\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.967302 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.966809 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d09d25b-d17a-4e56-8370-75c92ceb73de-cert\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.968710 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.968683 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5d09d25b-d17a-4e56-8370-75c92ceb73de-manager-config\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.971137 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.970774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5d09d25b-d17a-4e56-8370-75c92ceb73de-metrics-cert\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " 
pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.981197 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.981165 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d09d25b-d17a-4e56-8370-75c92ceb73de-cert\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:15:59.984430 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:15:59.984400 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28sxg\" (UniqueName: \"kubernetes.io/projected/5d09d25b-d17a-4e56-8370-75c92ceb73de-kube-api-access-28sxg\") pod \"lws-controller-manager-688fc496d-8v748\" (UID: \"5d09d25b-d17a-4e56-8370-75c92ceb73de\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:16:00.170771 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:00.170665 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:16:00.300289 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:00.300266 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-688fc496d-8v748"] Apr 19 12:16:00.303043 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:16:00.303013 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d09d25b_d17a_4e56_8370_75c92ceb73de.slice/crio-01538d180eaacbe25f457ac567e1d4e7e5bc92877d02bd3eb1aff64ce6d17de0 WatchSource:0}: Error finding container 01538d180eaacbe25f457ac567e1d4e7e5bc92877d02bd3eb1aff64ce6d17de0: Status 404 returned error can't find the container with id 01538d180eaacbe25f457ac567e1d4e7e5bc92877d02bd3eb1aff64ce6d17de0 Apr 19 12:16:01.106664 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:01.106618 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" event={"ID":"5d09d25b-d17a-4e56-8370-75c92ceb73de","Type":"ContainerStarted","Data":"01538d180eaacbe25f457ac567e1d4e7e5bc92877d02bd3eb1aff64ce6d17de0"} Apr 19 12:16:04.119993 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:04.119953 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" event={"ID":"5d09d25b-d17a-4e56-8370-75c92ceb73de","Type":"ContainerStarted","Data":"b98abc4eb818a04da6dd7b1684adac7c563bf377fed881cf39354784c7ad3126"} Apr 19 12:16:04.120477 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:04.120096 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:16:04.147307 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:04.147256 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" podStartSLOduration=2.091186623 podStartE2EDuration="5.147243134s" podCreationTimestamp="2026-04-19 12:15:59 +0000 UTC" firstStartedPulling="2026-04-19 12:16:00.304958283 +0000 UTC m=+369.059150088" lastFinishedPulling="2026-04-19 12:16:03.361014793 +0000 UTC m=+372.115206599" observedRunningTime="2026-04-19 12:16:04.145518574 +0000 UTC m=+372.899710400" watchObservedRunningTime="2026-04-19 12:16:04.147243134 +0000 UTC m=+372.901434960" Apr 19 12:16:15.126275 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:15.126242 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-688fc496d-8v748" Apr 19 12:16:22.542573 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.542537 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z"] Apr 19 12:16:22.551844 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.551815 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.554925 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.554893 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z"] Apr 19 12:16:22.555716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.555509 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 19 12:16:22.555716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.555521 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 19 12:16:22.555716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.555561 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 19 12:16:22.555716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.555597 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-x9f2b\"" Apr 19 12:16:22.655966 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.655933 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.655966 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.655971 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/8a65d157-d594-470e-b93a-fba64ab1bdeb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.656198 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.656021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.656198 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.656042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.656198 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.656101 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbh9\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-kube-api-access-jcbh9\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.656198 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.656174 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.656198 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.656195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.656385 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.656233 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.656385 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.656300 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.749640 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.749582 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64"] Apr 19 12:16:22.753279 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.753258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.757483 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757458 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.757609 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757492 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.757609 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.757732 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/8a65d157-d594-470e-b93a-fba64ab1bdeb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.757808 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757794 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.757860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.757918 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757877 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.757918 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757887 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbh9\" (UniqueName: 
\"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-kube-api-access-jcbh9\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.758021 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757926 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.758021 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.757981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.758021 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.758014 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.758192 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.758163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.758457 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.758435 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.758741 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.758719 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8a65d157-d594-470e-b93a-fba64ab1bdeb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.760501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.760469 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.760609 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.760587 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-envoy\") pod 
\"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.762863 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.762840 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64"] Apr 19 12:16:22.768323 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.768299 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.768605 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.768582 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcbh9\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-kube-api-access-jcbh9\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.859306 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859220 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859306 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859276 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859484 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859339 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859484 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859374 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859484 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859392 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859484 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:16:22.859439 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/550eb47b-2375-423e-964b-6fd281a5d1a2-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859608 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859608 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859580 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.859708 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.859619 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wps8\" (UniqueName: \"kubernetes.io/projected/550eb47b-2375-423e-964b-6fd281a5d1a2-kube-api-access-4wps8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.867075 
ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.867050 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:22.960745 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960696 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.960920 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.960920 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.960920 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-data\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.960920 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960845 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/550eb47b-2375-423e-964b-6fd281a5d1a2-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.960920 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.960979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wps8\" (UniqueName: \"kubernetes.io/projected/550eb47b-2375-423e-964b-6fd281a5d1a2-kube-api-access-4wps8\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.961009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.961066 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.961192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961577 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.961532 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: 
\"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961817 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.961790 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.961924 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.961890 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/550eb47b-2375-423e-964b-6fd281a5d1a2-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.963287 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.963265 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.963645 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.963625 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.969065 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.969037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/550eb47b-2375-423e-964b-6fd281a5d1a2-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.969472 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.969452 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wps8\" (UniqueName: \"kubernetes.io/projected/550eb47b-2375-423e-964b-6fd281a5d1a2-kube-api-access-4wps8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64\" (UID: \"550eb47b-2375-423e-964b-6fd281a5d1a2\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:22.996075 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:22.996048 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z"] Apr 19 12:16:22.998199 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:16:22.998167 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a65d157_d594_470e_b93a_fba64ab1bdeb.slice/crio-82880f5e424507fe413460ced3e9d508b2fc554ca5c5e2c0e55c65984af99ce1 WatchSource:0}: Error finding container 82880f5e424507fe413460ced3e9d508b2fc554ca5c5e2c0e55c65984af99ce1: Status 404 returned error can't find the container with id 82880f5e424507fe413460ced3e9d508b2fc554ca5c5e2c0e55c65984af99ce1 Apr 19 12:16:23.086825 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:23.086787 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:23.188720 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:23.188688 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" event={"ID":"8a65d157-d594-470e-b93a-fba64ab1bdeb","Type":"ContainerStarted","Data":"82880f5e424507fe413460ced3e9d508b2fc554ca5c5e2c0e55c65984af99ce1"} Apr 19 12:16:23.206797 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:23.206772 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64"] Apr 19 12:16:23.209495 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:16:23.209467 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod550eb47b_2375_423e_964b_6fd281a5d1a2.slice/crio-46d93dcccfa50f093df1b6d723d5e1df3f4a10c240a14471f2f5d6df2983350c WatchSource:0}: Error finding container 46d93dcccfa50f093df1b6d723d5e1df3f4a10c240a14471f2f5d6df2983350c: Status 404 returned error can't find the container with id 46d93dcccfa50f093df1b6d723d5e1df3f4a10c240a14471f2f5d6df2983350c Apr 19 12:16:24.194233 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:24.194188 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" event={"ID":"550eb47b-2375-423e-964b-6fd281a5d1a2","Type":"ContainerStarted","Data":"46d93dcccfa50f093df1b6d723d5e1df3f4a10c240a14471f2f5d6df2983350c"} Apr 19 12:16:25.481801 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:25.481757 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:16:25.482216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:25.481862 
2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:16:25.482216 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:25.481901 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:16:25.487599 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:25.487573 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:16:25.487694 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:25.487625 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:16:25.487694 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:25.487653 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:16:26.202789 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:26.202745 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" event={"ID":"550eb47b-2375-423e-964b-6fd281a5d1a2","Type":"ContainerStarted","Data":"3315ccc2e88cd79af5f1cb16bf93fdf8e86dea1587560c141740d60d75c20617"} Apr 19 12:16:26.204049 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:26.204022 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" event={"ID":"8a65d157-d594-470e-b93a-fba64ab1bdeb","Type":"ContainerStarted","Data":"cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8"} Apr 19 
12:16:26.233732 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:26.233686 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" podStartSLOduration=1.9578988370000001 podStartE2EDuration="4.233671919s" podCreationTimestamp="2026-04-19 12:16:22 +0000 UTC" firstStartedPulling="2026-04-19 12:16:23.211612933 +0000 UTC m=+391.965804742" lastFinishedPulling="2026-04-19 12:16:25.487386018 +0000 UTC m=+394.241577824" observedRunningTime="2026-04-19 12:16:26.232688669 +0000 UTC m=+394.986880511" watchObservedRunningTime="2026-04-19 12:16:26.233671919 +0000 UTC m=+394.987863746" Apr 19 12:16:26.264611 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:26.264566 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" podStartSLOduration=1.783077037 podStartE2EDuration="4.264553143s" podCreationTimestamp="2026-04-19 12:16:22 +0000 UTC" firstStartedPulling="2026-04-19 12:16:23.000025272 +0000 UTC m=+391.754217088" lastFinishedPulling="2026-04-19 12:16:25.481501385 +0000 UTC m=+394.235693194" observedRunningTime="2026-04-19 12:16:26.262234662 +0000 UTC m=+395.016426511" watchObservedRunningTime="2026-04-19 12:16:26.264553143 +0000 UTC m=+395.018744970" Apr 19 12:16:26.867749 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:26.867710 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:26.868883 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:26.868859 2567 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.27:15021/healthz/ready\": dial tcp 10.134.0.27:15021: connect: connection refused" 
start-of-body= Apr 19 12:16:26.868965 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:26.868918 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.27:15021/healthz/ready\": dial tcp 10.134.0.27:15021: connect: connection refused" Apr 19 12:16:27.086900 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:27.086864 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:27.091566 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:27.091539 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:27.208678 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:27.208645 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:27.209428 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:27.209410 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64" Apr 19 12:16:27.255701 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:27.255669 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z"] Apr 19 12:16:27.868137 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:27.868028 2567 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.27:15021/healthz/ready\": dial tcp 
10.134.0.27:15021: connect: connection refused" start-of-body= Apr 19 12:16:27.868137 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:27.868091 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.27:15021/healthz/ready\": dial tcp 10.134.0.27:15021: connect: connection refused" Apr 19 12:16:28.867480 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:28.867439 2567 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.27:15021/healthz/ready\": dial tcp 10.134.0.27:15021: connect: connection refused" start-of-body= Apr 19 12:16:28.867676 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:28.867502 2567 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.27:15021/healthz/ready\": dial tcp 10.134.0.27:15021: connect: connection refused" Apr 19 12:16:29.220452 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:29.220416 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerName="istio-proxy" containerID="cri-o://cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8" gracePeriod=30 Apr 19 12:16:34.460865 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.460839 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:34.567181 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567090 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-podinfo\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567181 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567164 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-credential-socket\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567398 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567215 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcbh9\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-kube-api-access-jcbh9\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567398 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567273 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-envoy\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567398 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567306 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-token\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 
12:16:34.567398 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567347 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-data\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567398 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567392 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-certs\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567626 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567427 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8a65d157-d594-470e-b93a-fba64ab1bdeb-istiod-ca-cert\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567626 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567444 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "credential-socket". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:16:34.567626 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567467 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-socket\") pod \"8a65d157-d594-470e-b93a-fba64ab1bdeb\" (UID: \"8a65d157-d594-470e-b93a-fba64ab1bdeb\") " Apr 19 12:16:34.567775 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567629 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:16:34.567775 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567763 2567 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-certs\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.567879 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567785 2567 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-credential-socket\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.567879 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567860 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "workload-socket". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:16:34.567982 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.567932 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a65d157-d594-470e-b93a-fba64ab1bdeb-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:16:34.568106 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.568082 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-data" (OuterVolumeSpecName: "istio-data") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:16:34.569520 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.569496 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-token" (OuterVolumeSpecName: "istio-token") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:16:34.569627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.569561 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-kube-api-access-jcbh9" (OuterVolumeSpecName: "kube-api-access-jcbh9") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "kube-api-access-jcbh9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:16:34.569707 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.569691 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 19 12:16:34.569747 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.569712 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "8a65d157-d594-470e-b93a-fba64ab1bdeb" (UID: "8a65d157-d594-470e-b93a-fba64ab1bdeb"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:16:34.668518 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.668484 2567 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-token\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.668518 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.668513 2567 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-data\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.668518 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.668523 2567 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8a65d157-d594-470e-b93a-fba64ab1bdeb-istiod-ca-cert\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.668742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.668531 2567 
reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-workload-socket\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.668742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.668541 2567 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-podinfo\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.668742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.668551 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jcbh9\" (UniqueName: \"kubernetes.io/projected/8a65d157-d594-470e-b93a-fba64ab1bdeb-kube-api-access-jcbh9\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:34.668742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:34.668560 2567 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8a65d157-d594-470e-b93a-fba64ab1bdeb-istio-envoy\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:16:35.241919 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.241884 2567 generic.go:358] "Generic (PLEG): container finished" podID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerID="cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8" exitCode=0 Apr 19 12:16:35.242100 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.241953 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" Apr 19 12:16:35.242100 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.241966 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" event={"ID":"8a65d157-d594-470e-b93a-fba64ab1bdeb","Type":"ContainerDied","Data":"cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8"} Apr 19 12:16:35.242100 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.242006 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z" event={"ID":"8a65d157-d594-470e-b93a-fba64ab1bdeb","Type":"ContainerDied","Data":"82880f5e424507fe413460ced3e9d508b2fc554ca5c5e2c0e55c65984af99ce1"} Apr 19 12:16:35.242100 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.242026 2567 scope.go:117] "RemoveContainer" containerID="cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8" Apr 19 12:16:35.250839 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.250815 2567 scope.go:117] "RemoveContainer" containerID="cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8" Apr 19 12:16:35.251165 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:16:35.251143 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8\": container with ID starting with cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8 not found: ID does not exist" containerID="cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8" Apr 19 12:16:35.251226 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.251174 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8"} 
err="failed to get container status \"cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8\": rpc error: code = NotFound desc = could not find container \"cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8\": container with ID starting with cbf84b62c50c9a8a03a73b314071d78bbd1464101634b147ce0d12455f70f3f8 not found: ID does not exist" Apr 19 12:16:35.262338 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.262309 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z"] Apr 19 12:16:35.265503 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.265481 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd54sk7z"] Apr 19 12:16:35.837981 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:35.837947 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" path="/var/lib/kubelet/pods/8a65d157-d594-470e-b93a-fba64ab1bdeb/volumes" Apr 19 12:16:56.370719 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.370681 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r96bs"] Apr 19 12:16:56.371097 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.371017 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerName="istio-proxy" Apr 19 12:16:56.371097 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.371027 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerName="istio-proxy" Apr 19 12:16:56.371193 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.371104 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a65d157-d594-470e-b93a-fba64ab1bdeb" containerName="istio-proxy" Apr 19 12:16:56.378725 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:16:56.378699 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" Apr 19 12:16:56.380164 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.380134 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r96bs"] Apr 19 12:16:56.380975 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.380953 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-bsbsh\"" Apr 19 12:16:56.381081 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.380977 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 19 12:16:56.382068 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.382044 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 19 12:16:56.457303 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.457263 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvfh\" (UniqueName: \"kubernetes.io/projected/f5e7d3c4-2d10-4ad1-b460-064b838eca80-kube-api-access-zkvfh\") pod \"kuadrant-operator-catalog-r96bs\" (UID: \"f5e7d3c4-2d10-4ad1-b460-064b838eca80\") " pod="kuadrant-system/kuadrant-operator-catalog-r96bs" Apr 19 12:16:56.558190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.558157 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkvfh\" (UniqueName: \"kubernetes.io/projected/f5e7d3c4-2d10-4ad1-b460-064b838eca80-kube-api-access-zkvfh\") pod \"kuadrant-operator-catalog-r96bs\" (UID: \"f5e7d3c4-2d10-4ad1-b460-064b838eca80\") " pod="kuadrant-system/kuadrant-operator-catalog-r96bs" Apr 19 12:16:56.568793 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.568766 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zkvfh\" (UniqueName: \"kubernetes.io/projected/f5e7d3c4-2d10-4ad1-b460-064b838eca80-kube-api-access-zkvfh\") pod \"kuadrant-operator-catalog-r96bs\" (UID: \"f5e7d3c4-2d10-4ad1-b460-064b838eca80\") " pod="kuadrant-system/kuadrant-operator-catalog-r96bs" Apr 19 12:16:56.690298 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.690266 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" Apr 19 12:16:56.747924 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.747891 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r96bs"] Apr 19 12:16:56.818372 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.818345 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r96bs"] Apr 19 12:16:56.821053 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:16:56.821025 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5e7d3c4_2d10_4ad1_b460_064b838eca80.slice/crio-2539f103607041c047f7031f270ec01787f4a17193acfb540d4fc54c85ea8721 WatchSource:0}: Error finding container 2539f103607041c047f7031f270ec01787f4a17193acfb540d4fc54c85ea8721: Status 404 returned error can't find the container with id 2539f103607041c047f7031f270ec01787f4a17193acfb540d4fc54c85ea8721 Apr 19 12:16:56.953408 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.953330 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hm7z6"] Apr 19 12:16:56.958074 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.958056 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:16:56.963157 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:56.963129 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hm7z6"] Apr 19 12:16:57.062654 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:57.062606 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b472n\" (UniqueName: \"kubernetes.io/projected/bcc90d50-936f-4717-a726-3211787fb320-kube-api-access-b472n\") pod \"kuadrant-operator-catalog-hm7z6\" (UID: \"bcc90d50-936f-4717-a726-3211787fb320\") " pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:16:57.164978 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:57.163900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b472n\" (UniqueName: \"kubernetes.io/projected/bcc90d50-936f-4717-a726-3211787fb320-kube-api-access-b472n\") pod \"kuadrant-operator-catalog-hm7z6\" (UID: \"bcc90d50-936f-4717-a726-3211787fb320\") " pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:16:57.178460 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:57.178427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b472n\" (UniqueName: \"kubernetes.io/projected/bcc90d50-936f-4717-a726-3211787fb320-kube-api-access-b472n\") pod \"kuadrant-operator-catalog-hm7z6\" (UID: \"bcc90d50-936f-4717-a726-3211787fb320\") " pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:16:57.269439 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:57.269358 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:16:57.316277 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:57.316237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" event={"ID":"f5e7d3c4-2d10-4ad1-b460-064b838eca80","Type":"ContainerStarted","Data":"2539f103607041c047f7031f270ec01787f4a17193acfb540d4fc54c85ea8721"} Apr 19 12:16:57.394143 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:57.394100 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hm7z6"] Apr 19 12:16:57.396271 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:16:57.396241 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc90d50_936f_4717_a726_3211787fb320.slice/crio-17f6b6c6532526f503332017eae96cc4aee99d507794c8505537371a948a83bf WatchSource:0}: Error finding container 17f6b6c6532526f503332017eae96cc4aee99d507794c8505537371a948a83bf: Status 404 returned error can't find the container with id 17f6b6c6532526f503332017eae96cc4aee99d507794c8505537371a948a83bf Apr 19 12:16:58.321599 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:58.321553 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" event={"ID":"bcc90d50-936f-4717-a726-3211787fb320","Type":"ContainerStarted","Data":"17f6b6c6532526f503332017eae96cc4aee99d507794c8505537371a948a83bf"} Apr 19 12:16:59.326581 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.326477 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" event={"ID":"bcc90d50-936f-4717-a726-3211787fb320","Type":"ContainerStarted","Data":"a3dd9581ae82731159a5bb8ab53a2bd758f6324cd28321b9f18c2d8c33f5a1cc"} Apr 19 12:16:59.327938 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.327913 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-r96bs" event={"ID":"f5e7d3c4-2d10-4ad1-b460-064b838eca80","Type":"ContainerStarted","Data":"b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5"} Apr 19 12:16:59.328059 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.327967 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" podUID="f5e7d3c4-2d10-4ad1-b460-064b838eca80" containerName="registry-server" containerID="cri-o://b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5" gracePeriod=2 Apr 19 12:16:59.341409 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.341362 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" podStartSLOduration=1.712008878 podStartE2EDuration="3.341348943s" podCreationTimestamp="2026-04-19 12:16:56 +0000 UTC" firstStartedPulling="2026-04-19 12:16:57.397648425 +0000 UTC m=+426.151840230" lastFinishedPulling="2026-04-19 12:16:59.02698849 +0000 UTC m=+427.781180295" observedRunningTime="2026-04-19 12:16:59.340140482 +0000 UTC m=+428.094332306" watchObservedRunningTime="2026-04-19 12:16:59.341348943 +0000 UTC m=+428.095540769" Apr 19 12:16:59.357242 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.357195 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" podStartSLOduration=1.180385064 podStartE2EDuration="3.357178923s" podCreationTimestamp="2026-04-19 12:16:56 +0000 UTC" firstStartedPulling="2026-04-19 12:16:56.822401189 +0000 UTC m=+425.576592997" lastFinishedPulling="2026-04-19 12:16:58.999195043 +0000 UTC m=+427.753386856" observedRunningTime="2026-04-19 12:16:59.356242525 +0000 UTC m=+428.110434355" watchObservedRunningTime="2026-04-19 12:16:59.357178923 +0000 UTC m=+428.111370749" Apr 19 12:16:59.573224 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.573198 2567 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" Apr 19 12:16:59.687724 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.687687 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvfh\" (UniqueName: \"kubernetes.io/projected/f5e7d3c4-2d10-4ad1-b460-064b838eca80-kube-api-access-zkvfh\") pod \"f5e7d3c4-2d10-4ad1-b460-064b838eca80\" (UID: \"f5e7d3c4-2d10-4ad1-b460-064b838eca80\") " Apr 19 12:16:59.689879 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.689844 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e7d3c4-2d10-4ad1-b460-064b838eca80-kube-api-access-zkvfh" (OuterVolumeSpecName: "kube-api-access-zkvfh") pod "f5e7d3c4-2d10-4ad1-b460-064b838eca80" (UID: "f5e7d3c4-2d10-4ad1-b460-064b838eca80"). InnerVolumeSpecName "kube-api-access-zkvfh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:16:59.788705 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:16:59.788665 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkvfh\" (UniqueName: \"kubernetes.io/projected/f5e7d3c4-2d10-4ad1-b460-064b838eca80-kube-api-access-zkvfh\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:17:00.332796 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.332708 2567 generic.go:358] "Generic (PLEG): container finished" podID="f5e7d3c4-2d10-4ad1-b460-064b838eca80" containerID="b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5" exitCode=0 Apr 19 12:17:00.332796 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.332764 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" Apr 19 12:17:00.333263 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.332791 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" event={"ID":"f5e7d3c4-2d10-4ad1-b460-064b838eca80","Type":"ContainerDied","Data":"b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5"} Apr 19 12:17:00.333263 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.332836 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-r96bs" event={"ID":"f5e7d3c4-2d10-4ad1-b460-064b838eca80","Type":"ContainerDied","Data":"2539f103607041c047f7031f270ec01787f4a17193acfb540d4fc54c85ea8721"} Apr 19 12:17:00.333263 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.332856 2567 scope.go:117] "RemoveContainer" containerID="b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5" Apr 19 12:17:00.341599 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.341582 2567 scope.go:117] "RemoveContainer" containerID="b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5" Apr 19 12:17:00.341887 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:17:00.341865 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5\": container with ID starting with b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5 not found: ID does not exist" containerID="b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5" Apr 19 12:17:00.341981 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.341892 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5"} err="failed to get container status \"b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5\": rpc 
error: code = NotFound desc = could not find container \"b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5\": container with ID starting with b0a3dd638a3b50db0e06cb21fb768d0c1cf9d472b4731bb1d36e3f992aa629e5 not found: ID does not exist" Apr 19 12:17:00.349046 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.349021 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r96bs"] Apr 19 12:17:00.354039 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:00.354017 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-r96bs"] Apr 19 12:17:01.837355 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:01.837321 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e7d3c4-2d10-4ad1-b460-064b838eca80" path="/var/lib/kubelet/pods/f5e7d3c4-2d10-4ad1-b460-064b838eca80/volumes" Apr 19 12:17:07.270492 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:07.270397 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:17:07.270492 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:07.270461 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:17:07.292282 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:07.292254 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:17:07.379962 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:07.379934 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-hm7z6" Apr 19 12:17:25.555190 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.555149 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-r5mhn"] Apr 19 12:17:25.555562 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:17:25.555521 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5e7d3c4-2d10-4ad1-b460-064b838eca80" containerName="registry-server" Apr 19 12:17:25.555562 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.555531 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e7d3c4-2d10-4ad1-b460-064b838eca80" containerName="registry-server" Apr 19 12:17:25.555635 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.555582 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5e7d3c4-2d10-4ad1-b460-064b838eca80" containerName="registry-server" Apr 19 12:17:25.558835 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.558817 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" Apr 19 12:17:25.561146 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.561104 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-2fq6b\"" Apr 19 12:17:25.570167 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.570145 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-r5mhn"] Apr 19 12:17:25.725889 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.725849 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z2nz\" (UniqueName: \"kubernetes.io/projected/862242bd-b9b0-4d0b-97ff-4f41b481014a-kube-api-access-7z2nz\") pod \"authorino-operator-657f44b778-r5mhn\" (UID: \"862242bd-b9b0-4d0b-97ff-4f41b481014a\") " pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" Apr 19 12:17:25.826707 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.826615 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z2nz\" (UniqueName: 
\"kubernetes.io/projected/862242bd-b9b0-4d0b-97ff-4f41b481014a-kube-api-access-7z2nz\") pod \"authorino-operator-657f44b778-r5mhn\" (UID: \"862242bd-b9b0-4d0b-97ff-4f41b481014a\") " pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" Apr 19 12:17:25.836975 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.836948 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z2nz\" (UniqueName: \"kubernetes.io/projected/862242bd-b9b0-4d0b-97ff-4f41b481014a-kube-api-access-7z2nz\") pod \"authorino-operator-657f44b778-r5mhn\" (UID: \"862242bd-b9b0-4d0b-97ff-4f41b481014a\") " pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" Apr 19 12:17:25.869910 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:25.869874 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" Apr 19 12:17:26.005390 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:26.005303 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-r5mhn"] Apr 19 12:17:26.008128 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:17:26.008088 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod862242bd_b9b0_4d0b_97ff_4f41b481014a.slice/crio-7d6b8bf3609d04118a58441b6fed6f7724d0bb0c5351473801243018aebf3174 WatchSource:0}: Error finding container 7d6b8bf3609d04118a58441b6fed6f7724d0bb0c5351473801243018aebf3174: Status 404 returned error can't find the container with id 7d6b8bf3609d04118a58441b6fed6f7724d0bb0c5351473801243018aebf3174 Apr 19 12:17:26.421183 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:26.421143 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" event={"ID":"862242bd-b9b0-4d0b-97ff-4f41b481014a","Type":"ContainerStarted","Data":"7d6b8bf3609d04118a58441b6fed6f7724d0bb0c5351473801243018aebf3174"} Apr 19 
12:17:28.430877 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.430839 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" event={"ID":"862242bd-b9b0-4d0b-97ff-4f41b481014a","Type":"ContainerStarted","Data":"319f37e027442f7443a922128a2021b881b496c7b814d393ef6b1156507665cd"} Apr 19 12:17:28.431236 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.430930 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" Apr 19 12:17:28.446545 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.446459 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" podStartSLOduration=1.26624564 podStartE2EDuration="3.446445934s" podCreationTimestamp="2026-04-19 12:17:25 +0000 UTC" firstStartedPulling="2026-04-19 12:17:26.010528264 +0000 UTC m=+454.764720072" lastFinishedPulling="2026-04-19 12:17:28.190728557 +0000 UTC m=+456.944920366" observedRunningTime="2026-04-19 12:17:28.444003231 +0000 UTC m=+457.198195058" watchObservedRunningTime="2026-04-19 12:17:28.446445934 +0000 UTC m=+457.200637760" Apr 19 12:17:28.896258 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.896180 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm"] Apr 19 12:17:28.899758 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.899740 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" Apr 19 12:17:28.902374 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.902354 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-698gg\"" Apr 19 12:17:28.902472 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.902397 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 19 12:17:28.908625 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.908599 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm"] Apr 19 12:17:28.962708 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:28.962674 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dvt\" (UniqueName: \"kubernetes.io/projected/cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3-kube-api-access-h6dvt\") pod \"dns-operator-controller-manager-648d5c98bc-qsfwm\" (UID: \"cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" Apr 19 12:17:29.063644 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:29.063601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dvt\" (UniqueName: \"kubernetes.io/projected/cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3-kube-api-access-h6dvt\") pod \"dns-operator-controller-manager-648d5c98bc-qsfwm\" (UID: \"cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" Apr 19 12:17:29.079684 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:29.079649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dvt\" (UniqueName: \"kubernetes.io/projected/cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3-kube-api-access-h6dvt\") pod 
\"dns-operator-controller-manager-648d5c98bc-qsfwm\" (UID: \"cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" Apr 19 12:17:29.211322 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:29.211277 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" Apr 19 12:17:29.331459 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:29.331434 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm"] Apr 19 12:17:29.333492 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:17:29.333456 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb2efd2_d02b_4efd_a044_e2f1a5c5b9a3.slice/crio-5afb482e6f54beec887e91b76c913760fbb1a4599f63d52fac8074a509057b37 WatchSource:0}: Error finding container 5afb482e6f54beec887e91b76c913760fbb1a4599f63d52fac8074a509057b37: Status 404 returned error can't find the container with id 5afb482e6f54beec887e91b76c913760fbb1a4599f63d52fac8074a509057b37 Apr 19 12:17:29.435428 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:29.435394 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" event={"ID":"cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3","Type":"ContainerStarted","Data":"5afb482e6f54beec887e91b76c913760fbb1a4599f63d52fac8074a509057b37"} Apr 19 12:17:32.226809 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.226771 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5"] Apr 19 12:17:32.230204 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.230182 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:17:32.232426 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.232407 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-s2ms2\"" Apr 19 12:17:32.240722 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.240698 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5"] Apr 19 12:17:32.296260 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.296225 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7qrv\" (UniqueName: \"kubernetes.io/projected/146e9d90-9840-449e-80ff-bb0016144a98-kube-api-access-f7qrv\") pod \"limitador-operator-controller-manager-85c4996f8c-xj5j5\" (UID: \"146e9d90-9840-449e-80ff-bb0016144a98\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:17:32.397214 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.397173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7qrv\" (UniqueName: \"kubernetes.io/projected/146e9d90-9840-449e-80ff-bb0016144a98-kube-api-access-f7qrv\") pod \"limitador-operator-controller-manager-85c4996f8c-xj5j5\" (UID: \"146e9d90-9840-449e-80ff-bb0016144a98\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:17:32.407839 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.407814 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qrv\" (UniqueName: \"kubernetes.io/projected/146e9d90-9840-449e-80ff-bb0016144a98-kube-api-access-f7qrv\") pod \"limitador-operator-controller-manager-85c4996f8c-xj5j5\" (UID: \"146e9d90-9840-449e-80ff-bb0016144a98\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:17:32.448282 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.448244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" event={"ID":"cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3","Type":"ContainerStarted","Data":"b66ae2a74e8f51e8585659334f7aff065a6646c0210833b23e4f3c3cc64c389f"} Apr 19 12:17:32.448445 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.448428 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" Apr 19 12:17:32.468690 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.468627 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" podStartSLOduration=2.418400188 podStartE2EDuration="4.46861102s" podCreationTimestamp="2026-04-19 12:17:28 +0000 UTC" firstStartedPulling="2026-04-19 12:17:29.335545812 +0000 UTC m=+458.089737616" lastFinishedPulling="2026-04-19 12:17:31.38575664 +0000 UTC m=+460.139948448" observedRunningTime="2026-04-19 12:17:32.468068813 +0000 UTC m=+461.222260641" watchObservedRunningTime="2026-04-19 12:17:32.46861102 +0000 UTC m=+461.222802849" Apr 19 12:17:32.541544 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.541455 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:17:32.680247 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:32.680220 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5"] Apr 19 12:17:32.682682 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:17:32.682658 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146e9d90_9840_449e_80ff_bb0016144a98.slice/crio-94b9a0d21f0743a82ba8d40a41ccd50395c08f8038142a907a2fec341472f5e5 WatchSource:0}: Error finding container 94b9a0d21f0743a82ba8d40a41ccd50395c08f8038142a907a2fec341472f5e5: Status 404 returned error can't find the container with id 94b9a0d21f0743a82ba8d40a41ccd50395c08f8038142a907a2fec341472f5e5 Apr 19 12:17:33.455018 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:33.454975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" event={"ID":"146e9d90-9840-449e-80ff-bb0016144a98","Type":"ContainerStarted","Data":"94b9a0d21f0743a82ba8d40a41ccd50395c08f8038142a907a2fec341472f5e5"} Apr 19 12:17:34.460844 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:34.460805 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" event={"ID":"146e9d90-9840-449e-80ff-bb0016144a98","Type":"ContainerStarted","Data":"62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855"} Apr 19 12:17:34.461207 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:34.460905 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:17:34.476511 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:34.476457 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" podStartSLOduration=0.791552456 podStartE2EDuration="2.47644172s" podCreationTimestamp="2026-04-19 12:17:32 +0000 UTC" firstStartedPulling="2026-04-19 12:17:32.68463661 +0000 UTC m=+461.438828415" lastFinishedPulling="2026-04-19 12:17:34.369525861 +0000 UTC m=+463.123717679" observedRunningTime="2026-04-19 12:17:34.474610151 +0000 UTC m=+463.228802011" watchObservedRunningTime="2026-04-19 12:17:34.47644172 +0000 UTC m=+463.230633546" Apr 19 12:17:39.437805 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:39.437758 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-r5mhn" Apr 19 12:17:43.457126 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:43.457092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-qsfwm" Apr 19 12:17:45.467473 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.467443 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:17:45.687226 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.687190 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh"] Apr 19 12:17:45.690916 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.690891 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.693179 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.693150 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 19 12:17:45.693179 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.693164 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 19 12:17:45.693374 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.693157 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6x9x6\"" Apr 19 12:17:45.699086 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.699060 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh"] Apr 19 12:17:45.815551 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.815463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a32a167-68ba-402d-9863-4c0331b7d2e6-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.815699 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.815562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a32a167-68ba-402d-9863-4c0331b7d2e6-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.815699 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.815626 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4h58g\" (UniqueName: \"kubernetes.io/projected/8a32a167-68ba-402d-9863-4c0331b7d2e6-kube-api-access-4h58g\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.917078 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.917037 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a32a167-68ba-402d-9863-4c0331b7d2e6-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.917312 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.917166 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a32a167-68ba-402d-9863-4c0331b7d2e6-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.917312 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.917221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h58g\" (UniqueName: \"kubernetes.io/projected/8a32a167-68ba-402d-9863-4c0331b7d2e6-kube-api-access-4h58g\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.917700 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.917677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a32a167-68ba-402d-9863-4c0331b7d2e6-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.919581 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.919549 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a32a167-68ba-402d-9863-4c0331b7d2e6-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:45.932959 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:45.932934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h58g\" (UniqueName: \"kubernetes.io/projected/8a32a167-68ba-402d-9863-4c0331b7d2e6-kube-api-access-4h58g\") pod \"kuadrant-console-plugin-6cb54b5c86-b2zlh\" (UID: \"8a32a167-68ba-402d-9863-4c0331b7d2e6\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:46.000998 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:46.000960 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" Apr 19 12:17:46.129536 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:46.129509 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh"] Apr 19 12:17:46.131894 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:17:46.131858 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a32a167_68ba_402d_9863_4c0331b7d2e6.slice/crio-2b7350813ab0d321b774e80c1d4b63ea243151809fcbfc0e548a69f8c3c3504c WatchSource:0}: Error finding container 2b7350813ab0d321b774e80c1d4b63ea243151809fcbfc0e548a69f8c3c3504c: Status 404 returned error can't find the container with id 2b7350813ab0d321b774e80c1d4b63ea243151809fcbfc0e548a69f8c3c3504c Apr 19 12:17:46.504915 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:46.504879 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" event={"ID":"8a32a167-68ba-402d-9863-4c0331b7d2e6","Type":"ContainerStarted","Data":"2b7350813ab0d321b774e80c1d4b63ea243151809fcbfc0e548a69f8c3c3504c"} Apr 19 12:17:56.335959 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.335923 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5"] Apr 19 12:17:56.336446 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.336240 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" podUID="146e9d90-9840-449e-80ff-bb0016144a98" containerName="manager" containerID="cri-o://62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855" gracePeriod=2 Apr 19 12:17:56.350001 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.349968 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5"] Apr 19 12:17:56.360524 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.360492 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh"] Apr 19 12:17:56.361271 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.361167 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146e9d90-9840-449e-80ff-bb0016144a98" containerName="manager" Apr 19 12:17:56.361271 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.361194 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="146e9d90-9840-449e-80ff-bb0016144a98" containerName="manager" Apr 19 12:17:56.361459 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.361297 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="146e9d90-9840-449e-80ff-bb0016144a98" containerName="manager" Apr 19 12:17:56.374740 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.374710 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh"] Apr 19 12:17:56.374883 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.374850 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" Apr 19 12:17:56.525716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.525680 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jcx\" (UniqueName: \"kubernetes.io/projected/81f832e6-02fd-45df-bd6a-6d365185a209-kube-api-access-m6jcx\") pod \"limitador-operator-controller-manager-85c4996f8c-ww7lh\" (UID: \"81f832e6-02fd-45df-bd6a-6d365185a209\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" Apr 19 12:17:56.627178 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.627070 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jcx\" (UniqueName: \"kubernetes.io/projected/81f832e6-02fd-45df-bd6a-6d365185a209-kube-api-access-m6jcx\") pod \"limitador-operator-controller-manager-85c4996f8c-ww7lh\" (UID: \"81f832e6-02fd-45df-bd6a-6d365185a209\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" Apr 19 12:17:56.635758 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.635730 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jcx\" (UniqueName: \"kubernetes.io/projected/81f832e6-02fd-45df-bd6a-6d365185a209-kube-api-access-m6jcx\") pod \"limitador-operator-controller-manager-85c4996f8c-ww7lh\" (UID: \"81f832e6-02fd-45df-bd6a-6d365185a209\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" Apr 19 12:17:56.725043 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:17:56.724992 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" Apr 19 12:18:08.226187 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.226159 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh"] Apr 19 12:18:08.228099 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:18:08.228071 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81f832e6_02fd_45df_bd6a_6d365185a209.slice/crio-71d8cbc81873ae0e37bfca7003f8c8dfa8bd0e093bac3c0fc2b0bed343487e1c WatchSource:0}: Error finding container 71d8cbc81873ae0e37bfca7003f8c8dfa8bd0e093bac3c0fc2b0bed343487e1c: Status 404 returned error can't find the container with id 71d8cbc81873ae0e37bfca7003f8c8dfa8bd0e093bac3c0fc2b0bed343487e1c Apr 19 12:18:08.230052 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.230030 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:18:08.231919 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.231889 2567 status_manager.go:895] "Failed to get status for pod" podUID="146e9d90-9840-449e-80ff-bb0016144a98" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" err="pods \"limitador-operator-controller-manager-85c4996f8c-xj5j5\" is forbidden: User \"system:node:ip-10-0-131-150.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-150.ec2.internal' and this object" Apr 19 12:18:08.249485 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.249464 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7qrv\" (UniqueName: \"kubernetes.io/projected/146e9d90-9840-449e-80ff-bb0016144a98-kube-api-access-f7qrv\") pod \"146e9d90-9840-449e-80ff-bb0016144a98\" (UID: \"146e9d90-9840-449e-80ff-bb0016144a98\") " Apr 19 12:18:08.251416 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.251394 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146e9d90-9840-449e-80ff-bb0016144a98-kube-api-access-f7qrv" (OuterVolumeSpecName: "kube-api-access-f7qrv") pod "146e9d90-9840-449e-80ff-bb0016144a98" (UID: "146e9d90-9840-449e-80ff-bb0016144a98"). InnerVolumeSpecName "kube-api-access-f7qrv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:18:08.350558 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.350481 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7qrv\" (UniqueName: \"kubernetes.io/projected/146e9d90-9840-449e-80ff-bb0016144a98-kube-api-access-f7qrv\") on node \"ip-10-0-131-150.ec2.internal\" DevicePath \"\"" Apr 19 12:18:08.593602 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.593564 2567 generic.go:358] "Generic (PLEG): container finished" podID="146e9d90-9840-449e-80ff-bb0016144a98" containerID="62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855" exitCode=0 Apr 19 12:18:08.593790 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.593614 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" Apr 19 12:18:08.593790 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.593642 2567 scope.go:117] "RemoveContainer" containerID="62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855" Apr 19 12:18:08.595100 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.595065 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" event={"ID":"8a32a167-68ba-402d-9863-4c0331b7d2e6","Type":"ContainerStarted","Data":"f7bcbc6ad0c7033db2401df663d1632a4ea9078ac0b550def5ae9d719004b8e5"} Apr 19 12:18:08.595945 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.595911 2567 status_manager.go:895] "Failed to get status for pod" podUID="146e9d90-9840-449e-80ff-bb0016144a98" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" err="pods \"limitador-operator-controller-manager-85c4996f8c-xj5j5\" is forbidden: User \"system:node:ip-10-0-131-150.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-150.ec2.internal' and 
this object" Apr 19 12:18:08.596716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.596695 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" event={"ID":"81f832e6-02fd-45df-bd6a-6d365185a209","Type":"ContainerStarted","Data":"ef179c9e149763210c93e94977fad3f874876986654e35e6ae863d2f05816f1f"} Apr 19 12:18:08.596716 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.596722 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" event={"ID":"81f832e6-02fd-45df-bd6a-6d365185a209","Type":"ContainerStarted","Data":"71d8cbc81873ae0e37bfca7003f8c8dfa8bd0e093bac3c0fc2b0bed343487e1c"} Apr 19 12:18:08.596896 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.596815 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" Apr 19 12:18:08.603152 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.603135 2567 scope.go:117] "RemoveContainer" containerID="62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855" Apr 19 12:18:08.603372 ip-10-0-131-150 kubenswrapper[2567]: E0419 12:18:08.603354 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855\": container with ID starting with 62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855 not found: ID does not exist" containerID="62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855" Apr 19 12:18:08.603424 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.603379 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855"} err="failed to get container status 
\"62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855\": rpc error: code = NotFound desc = could not find container \"62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855\": container with ID starting with 62b331bf3a6303d607aff18e74a8b6020139bafc896646c9de81afb00f79c855 not found: ID does not exist" Apr 19 12:18:08.610093 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.610055 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b2zlh" podStartSLOduration=1.56150985 podStartE2EDuration="23.610043176s" podCreationTimestamp="2026-04-19 12:17:45 +0000 UTC" firstStartedPulling="2026-04-19 12:17:46.133273351 +0000 UTC m=+474.887465155" lastFinishedPulling="2026-04-19 12:18:08.18180666 +0000 UTC m=+496.935998481" observedRunningTime="2026-04-19 12:18:08.608578004 +0000 UTC m=+497.362769831" watchObservedRunningTime="2026-04-19 12:18:08.610043176 +0000 UTC m=+497.364235058" Apr 19 12:18:08.610284 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.610258 2567 status_manager.go:895] "Failed to get status for pod" podUID="146e9d90-9840-449e-80ff-bb0016144a98" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" err="pods \"limitador-operator-controller-manager-85c4996f8c-xj5j5\" is forbidden: User \"system:node:ip-10-0-131-150.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-150.ec2.internal' and this object" Apr 19 12:18:08.611736 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.611714 2567 status_manager.go:895] "Failed to get status for pod" podUID="146e9d90-9840-449e-80ff-bb0016144a98" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xj5j5" err="pods \"limitador-operator-controller-manager-85c4996f8c-xj5j5\" is forbidden: User \"system:node:ip-10-0-131-150.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the 
namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-150.ec2.internal' and this object" Apr 19 12:18:08.629939 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:08.629855 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" podStartSLOduration=12.629839683 podStartE2EDuration="12.629839683s" podCreationTimestamp="2026-04-19 12:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:18:08.628748577 +0000 UTC m=+497.382940403" watchObservedRunningTime="2026-04-19 12:18:08.629839683 +0000 UTC m=+497.384031510" Apr 19 12:18:09.837794 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:09.837763 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146e9d90-9840-449e-80ff-bb0016144a98" path="/var/lib/kubelet/pods/146e9d90-9840-449e-80ff-bb0016144a98/volumes" Apr 19 12:18:19.604689 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:19.604651 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ww7lh" Apr 19 12:18:24.986260 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:24.986226 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj"] Apr 19 12:18:25.007050 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.007011 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj"] Apr 19 12:18:25.007243 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.007204 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.009476 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.009458 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-tjxtl\"" Apr 19 12:18:25.104017 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.103983 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104017 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104032 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104308 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104308 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104217 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5lkkg\" (UniqueName: \"kubernetes.io/projected/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-kube-api-access-5lkkg\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104308 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104261 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104470 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104330 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104470 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104470 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104389 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.104470 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.104424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205172 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lkkg\" (UniqueName: \"kubernetes.io/projected/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-kube-api-access-5lkkg\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205360 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205185 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205360 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-token\") pod 
\"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205360 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205360 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205360 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205284 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205360 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205714 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205364 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205714 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205422 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205820 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205744 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205820 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205808 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205918 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:18:25.205892 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.205996 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.205973 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.206189 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.206171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.207600 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.207577 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.207702 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.207653 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.212679 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.212658 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.213168 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.213147 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lkkg\" (UniqueName: \"kubernetes.io/projected/10f1dd05-cbb1-4144-b0f7-2d667d234a5b-kube-api-access-5lkkg\") pod \"maas-default-gateway-openshift-default-58b6f876-7w8qj\" (UID: \"10f1dd05-cbb1-4144-b0f7-2d667d234a5b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.317835 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.317743 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:25.442625 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.442584 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj"] Apr 19 12:18:25.444493 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:18:25.444467 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f1dd05_cbb1_4144_b0f7_2d667d234a5b.slice/crio-f0527b8f927e5af19c7fccea8d1c99a43cff60fcaec8df4caadeb745743caa8a WatchSource:0}: Error finding container f0527b8f927e5af19c7fccea8d1c99a43cff60fcaec8df4caadeb745743caa8a: Status 404 returned error can't find the container with id f0527b8f927e5af19c7fccea8d1c99a43cff60fcaec8df4caadeb745743caa8a Apr 19 12:18:25.446707 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.446672 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:18:25.446799 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.446783 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:18:25.446864 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.446827 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 19 12:18:25.661205 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.661081 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" 
event={"ID":"10f1dd05-cbb1-4144-b0f7-2d667d234a5b","Type":"ContainerStarted","Data":"162bab17e5a19eea9928bb7c8c080a360cbaf2993fc34edb988364dee11e20d8"} Apr 19 12:18:25.661205 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.661154 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" event={"ID":"10f1dd05-cbb1-4144-b0f7-2d667d234a5b","Type":"ContainerStarted","Data":"f0527b8f927e5af19c7fccea8d1c99a43cff60fcaec8df4caadeb745743caa8a"} Apr 19 12:18:25.680989 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:25.680931 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" podStartSLOduration=1.680914193 podStartE2EDuration="1.680914193s" podCreationTimestamp="2026-04-19 12:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:18:25.677591702 +0000 UTC m=+514.431783528" watchObservedRunningTime="2026-04-19 12:18:25.680914193 +0000 UTC m=+514.435106021" Apr 19 12:18:26.317911 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:26.317873 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:26.322731 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:26.322706 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:26.664886 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:26.664799 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:26.665871 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:26.665850 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7w8qj" Apr 19 12:18:39.356646 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.356570 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:18:39.403739 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.403709 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:18:39.403739 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.403740 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:18:39.403922 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.403845 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.406095 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.406074 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 19 12:18:39.537304 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.537265 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4c6301b2-57c8-4295-90d6-3696c1d3e418-config-file\") pod \"limitador-limitador-78c99df468-lg5t4\" (UID: \"4c6301b2-57c8-4295-90d6-3696c1d3e418\") " pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.537487 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.537316 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77v54\" (UniqueName: \"kubernetes.io/projected/4c6301b2-57c8-4295-90d6-3696c1d3e418-kube-api-access-77v54\") pod \"limitador-limitador-78c99df468-lg5t4\" (UID: \"4c6301b2-57c8-4295-90d6-3696c1d3e418\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.638854 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.638771 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4c6301b2-57c8-4295-90d6-3696c1d3e418-config-file\") pod \"limitador-limitador-78c99df468-lg5t4\" (UID: \"4c6301b2-57c8-4295-90d6-3696c1d3e418\") " pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.638854 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.638820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77v54\" (UniqueName: \"kubernetes.io/projected/4c6301b2-57c8-4295-90d6-3696c1d3e418-kube-api-access-77v54\") pod \"limitador-limitador-78c99df468-lg5t4\" (UID: \"4c6301b2-57c8-4295-90d6-3696c1d3e418\") " pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.639407 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.639387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4c6301b2-57c8-4295-90d6-3696c1d3e418-config-file\") pod \"limitador-limitador-78c99df468-lg5t4\" (UID: \"4c6301b2-57c8-4295-90d6-3696c1d3e418\") " pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.646153 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.646107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77v54\" (UniqueName: \"kubernetes.io/projected/4c6301b2-57c8-4295-90d6-3696c1d3e418-kube-api-access-77v54\") pod \"limitador-limitador-78c99df468-lg5t4\" (UID: \"4c6301b2-57c8-4295-90d6-3696c1d3e418\") " pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.714523 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.714494 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:39.841214 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:39.841020 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:18:39.843645 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:18:39.843621 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c6301b2_57c8_4295_90d6_3696c1d3e418.slice/crio-a2d216830f0af54d11afd895539f91df4709b776c32fd4178375184c99fedd4a WatchSource:0}: Error finding container a2d216830f0af54d11afd895539f91df4709b776c32fd4178375184c99fedd4a: Status 404 returned error can't find the container with id a2d216830f0af54d11afd895539f91df4709b776c32fd4178375184c99fedd4a Apr 19 12:18:40.717258 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:40.717209 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" event={"ID":"4c6301b2-57c8-4295-90d6-3696c1d3e418","Type":"ContainerStarted","Data":"a2d216830f0af54d11afd895539f91df4709b776c32fd4178375184c99fedd4a"} Apr 19 12:18:42.726546 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:42.726512 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" event={"ID":"4c6301b2-57c8-4295-90d6-3696c1d3e418","Type":"ContainerStarted","Data":"019b27abc202dee78fbb8fedf739464f53f241e08371312ffb4c565c62a8ba5d"} Apr 19 12:18:42.726931 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:42.726633 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:18:42.743012 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:42.742952 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" 
podStartSLOduration=1.2748175339999999 podStartE2EDuration="3.742935483s" podCreationTimestamp="2026-04-19 12:18:39 +0000 UTC" firstStartedPulling="2026-04-19 12:18:39.845913651 +0000 UTC m=+528.600105457" lastFinishedPulling="2026-04-19 12:18:42.314031601 +0000 UTC m=+531.068223406" observedRunningTime="2026-04-19 12:18:42.740063299 +0000 UTC m=+531.494255126" watchObservedRunningTime="2026-04-19 12:18:42.742935483 +0000 UTC m=+531.497127309" Apr 19 12:18:53.731453 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:18:53.731423 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-lg5t4" Apr 19 12:19:34.443268 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:19:34.443226 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:19:51.777552 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:19:51.777525 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:19:51.779265 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:19:51.779235 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:19:51.781802 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:19:51.781784 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:19:51.783265 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:19:51.783247 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:20:11.910447 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:20:11.910365 2567 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:20:14.378754 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:20:14.378712 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:20:16.277740 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:20:16.277703 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:20:22.287667 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:20:22.287623 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:20:59.179384 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:20:59.179346 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:21:16.081627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:21:16.081584 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:21:25.877344 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:21:25.877308 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:22:03.679349 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:22:03.679273 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:22:07.979535 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:22:07.979492 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:22:12.196298 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:22:12.196262 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:22:24.079947 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:22:24.079909 2567 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:22:31.680006 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:22:31.679967 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:22:42.876301 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:22:42.876264 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:22:52.181235 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:22:52.181196 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:23:02.331884 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:23:02.331843 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:24:04.575825 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:24:04.575787 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:24:20.679928 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:24:20.679890 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:24:51.802855 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:24:51.802827 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:24:51.807961 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:24:51.807939 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:24:51.809750 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:24:51.809731 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:24:51.812182 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:24:51.812164 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:24:58.976040 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:24:58.976001 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:25:14.881139 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:25:14.881094 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:25:29.281222 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:25:29.281190 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:25:45.679393 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:25:45.679357 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:26:13.079132 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:26:13.079029 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:26:17.079054 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:26:17.079017 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:26:38.789541 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:26:38.789504 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:26:48.677781 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:26:48.677739 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:27:04.769361 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:27:04.769323 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:27:13.479162 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:27:13.479124 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:27:30.883436 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:27:30.883399 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:27:38.795586 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:27:38.795381 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:28:12.075405 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:28:12.075367 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:28:19.573738 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:28:19.573700 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:28:27.882169 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:28:27.882127 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:28:36.576218 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:28:36.576181 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:28:44.571200 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:28:44.571163 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:29:02.575813 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:29:02.575767 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:29:14.077128 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:29:14.077015 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:29:51.830835 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:29:51.830801 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:29:51.833940 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:29:51.833908 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:29:51.835097 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:29:51.835071 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:29:51.838768 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:29:51.838748 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:30:00.380143 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:30:00.380093 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:30:08.075814 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:30:08.075781 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:30:17.080951 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:30:17.080910 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:30:25.678403 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:30:25.678368 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:30:34.575809 ip-10-0-131-150 
kubenswrapper[2567]: I0419 12:30:34.575725 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:30:42.677149 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:30:42.677096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:30:51.573791 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:30:51.573756 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:31:00.475364 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:31:00.475330 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:31:09.376063 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:31:09.376025 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:31:17.676150 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:31:17.676099 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:31:26.679197 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:31:26.679159 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:31:34.979952 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:31:34.979913 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:31:44.176464 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:31:44.176420 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:31:52.176208 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:31:52.176156 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:32:01.086319 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:32:01.086278 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:32:09.884480 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:32:09.884404 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:32:18.082268 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:32:18.082234 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:32:26.979883 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:32:26.979845 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:34:46.573275 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:34:46.573238 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:34:51.574884 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:34:51.574848 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:34:51.856792 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:34:51.856707 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:34:51.860940 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:34:51.860913 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:34:51.861759 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:34:51.861732 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:34:51.865805 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:34:51.865787 2567 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:35:17.298192 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:35:17.292728 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:35:23.874403 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:35:23.874360 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:35:32.681596 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:35:32.681554 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:35:43.173161 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:35:43.173124 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:35:52.380673 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:35:52.380633 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:36:02.083081 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:36:02.083044 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:36:11.381570 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:36:11.381526 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:36:21.782592 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:36:21.782552 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:36:30.877424 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:36:30.877391 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:36:41.376862 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:36:41.376785 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:36:50.381272 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:36:50.381235 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:37:25.374086 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:37:25.374046 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:38:07.880366 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:38:07.880280 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:38:16.182082 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:38:16.182042 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:38:25.179487 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:38:25.179448 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:38:32.872256 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:38:32.872218 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:38:41.877284 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:38:41.877248 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:38:54.778377 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:38:54.778336 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:39:02.876127 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:02.876083 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:39:11.682754 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:11.682715 2567 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:39:19.578936 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:19.578898 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:39:27.582760 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:27.582721 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:39:35.770522 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:35.770434 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:39:46.575084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:46.575048 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:39:51.882258 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:51.882226 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:39:51.886529 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:51.886506 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:39:51.888276 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:51.888248 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log" Apr 19 12:39:51.892053 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:39:51.892035 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log" Apr 19 12:40:04.182726 ip-10-0-131-150 kubenswrapper[2567]: I0419 
12:40:04.182683 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:40:12.476574 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:40:12.476536 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:40:21.479278 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:40:21.479232 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:40:29.278534 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:40:29.278491 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:40:46.778167 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:40:46.778124 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:40:54.278027 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:40:54.277987 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:41:03.886084 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:41:03.886052 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:41:11.779950 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:41:11.779871 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:41:21.080541 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:41:21.080501 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:41:29.279063 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:41:29.279024 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:41:38.175991 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:41:38.175953 2567 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:41:49.785155 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:41:49.785103 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:41:58.782755 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:41:58.782714 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:42:10.881709 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:42:10.881671 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:42:20.482158 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:42:20.482099 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:42:28.880466 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:42:28.880427 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:42:36.681307 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:42:36.681216 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:42:44.478097 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:42:44.478057 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:43:01.075431 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:43:01.075394 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:43:10.381568 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:43:10.381528 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:43:18.880520 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:43:18.880477 2567 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:43:27.078930 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:43:27.078891 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:43:50.691417 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:43:50.691380 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:44:02.882051 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:02.882001 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lg5t4"] Apr 19 12:44:08.458003 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:08.457922 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-676bcb86f4-nxd54_b3d0a41c-19e8-426a-a855-03cf0363ad5a/manager/0.log" Apr 19 12:44:09.833134 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:09.833075 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-r5mhn_862242bd-b9b0-4d0b-97ff-4f41b481014a/manager/0.log" Apr 19 12:44:09.935860 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:09.935823 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-qsfwm_cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3/manager/0.log" Apr 19 12:44:10.036513 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:10.036483 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-b2zlh_8a32a167-68ba-402d-9863-4c0331b7d2e6/kuadrant-console-plugin/0.log" Apr 19 12:44:10.142166 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:10.142062 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-hm7z6_bcc90d50-936f-4717-a726-3211787fb320/registry-server/0.log" Apr 19 
12:44:10.356668 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:10.356636 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-lg5t4_4c6301b2-57c8-4295-90d6-3696c1d3e418/limitador/0.log"
Apr 19 12:44:10.459079 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:10.459039 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-ww7lh_81f832e6-02fd-45df-bd6a-6d365185a209/manager/0.log"
Apr 19 12:44:10.765975 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:10.765864 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64_550eb47b-2375-423e-964b-6fd281a5d1a2/istio-proxy/0.log"
Apr 19 12:44:11.182067 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:11.182031 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-7w8qj_10f1dd05-cbb1-4144-b0f7-2d667d234a5b/istio-proxy/0.log"
Apr 19 12:44:18.685501 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:18.685472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tplkv_598b50ba-3a7a-4f1d-8e80-a9fb189ba28a/global-pull-secret-syncer/0.log"
Apr 19 12:44:18.730669 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:18.730636 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6ljqn_3ed6a38c-6faa-41d7-855c-af958b4e6898/konnectivity-agent/0.log"
Apr 19 12:44:18.817625 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:18.817581 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-150.ec2.internal_21c64da2dfd5e799e05beb88f86623ce/haproxy/0.log"
Apr 19 12:44:23.338328 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:23.338296 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-r5mhn_862242bd-b9b0-4d0b-97ff-4f41b481014a/manager/0.log"
Apr 19 12:44:23.361741 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:23.361713 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-qsfwm_cbb2efd2-d02b-4efd-a044-e2f1a5c5b9a3/manager/0.log"
Apr 19 12:44:23.383366 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:23.383337 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-b2zlh_8a32a167-68ba-402d-9863-4c0331b7d2e6/kuadrant-console-plugin/0.log"
Apr 19 12:44:23.416194 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:23.416160 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-hm7z6_bcc90d50-936f-4717-a726-3211787fb320/registry-server/0.log"
Apr 19 12:44:23.506219 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:23.506185 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-lg5t4_4c6301b2-57c8-4295-90d6-3696c1d3e418/limitador/0.log"
Apr 19 12:44:23.565680 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:23.565650 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-ww7lh_81f832e6-02fd-45df-bd6a-6d365185a209/manager/0.log"
Apr 19 12:44:25.105961 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.105918 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-nzrxw_f239109c-ba15-4cff-b322-22aa684c3f58/cluster-monitoring-operator/0.log"
Apr 19 12:44:25.139406 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.139321 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-l4v2s_20d0fb14-1bf7-42b5-bee5-1769c34fea15/kube-state-metrics/0.log"
Apr 19 12:44:25.156679 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.156638 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-l4v2s_20d0fb14-1bf7-42b5-bee5-1769c34fea15/kube-rbac-proxy-main/0.log"
Apr 19 12:44:25.175581 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.175547 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-l4v2s_20d0fb14-1bf7-42b5-bee5-1769c34fea15/kube-rbac-proxy-self/0.log"
Apr 19 12:44:25.198525 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.198494 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-77c459b5d9-4q4r2_5870950f-5e7a-46c3-a04d-45a969812735/metrics-server/0.log"
Apr 19 12:44:25.220632 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.220608 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-sdr94_28bf133a-7fde-473d-aa9a-ba2319484941/monitoring-plugin/0.log"
Apr 19 12:44:25.247486 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.247454 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4hzrs_898b1e53-98cc-4608-bcf9-727c3e285ef4/node-exporter/0.log"
Apr 19 12:44:25.268137 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.268099 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4hzrs_898b1e53-98cc-4608-bcf9-727c3e285ef4/kube-rbac-proxy/0.log"
Apr 19 12:44:25.286420 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.286393 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4hzrs_898b1e53-98cc-4608-bcf9-727c3e285ef4/init-textfile/0.log"
Apr 19 12:44:25.508726 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.508697 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_31bab116-4755-4146-bc5c-30aebd8a4641/prometheus/0.log"
Apr 19 12:44:25.527180 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.527153 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_31bab116-4755-4146-bc5c-30aebd8a4641/config-reloader/0.log"
Apr 19 12:44:25.546124 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.546084 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_31bab116-4755-4146-bc5c-30aebd8a4641/thanos-sidecar/0.log"
Apr 19 12:44:25.566470 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.566449 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_31bab116-4755-4146-bc5c-30aebd8a4641/kube-rbac-proxy-web/0.log"
Apr 19 12:44:25.584675 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.584652 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_31bab116-4755-4146-bc5c-30aebd8a4641/kube-rbac-proxy/0.log"
Apr 19 12:44:25.603517 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.603494 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_31bab116-4755-4146-bc5c-30aebd8a4641/kube-rbac-proxy-thanos/0.log"
Apr 19 12:44:25.625388 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.625362 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_31bab116-4755-4146-bc5c-30aebd8a4641/init-config-reloader/0.log"
Apr 19 12:44:25.785515 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.785430 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d8865455f-6w88z_7e99e30a-bcfc-4582-b11a-60738ad7760f/thanos-query/0.log"
Apr 19 12:44:25.805887 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.805861 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d8865455f-6w88z_7e99e30a-bcfc-4582-b11a-60738ad7760f/kube-rbac-proxy-web/0.log"
Apr 19 12:44:25.823950 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.823920 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d8865455f-6w88z_7e99e30a-bcfc-4582-b11a-60738ad7760f/kube-rbac-proxy/0.log"
Apr 19 12:44:25.842126 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.842093 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d8865455f-6w88z_7e99e30a-bcfc-4582-b11a-60738ad7760f/prom-label-proxy/0.log"
Apr 19 12:44:25.866355 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.866326 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d8865455f-6w88z_7e99e30a-bcfc-4582-b11a-60738ad7760f/kube-rbac-proxy-rules/0.log"
Apr 19 12:44:25.888816 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:25.888784 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d8865455f-6w88z_7e99e30a-bcfc-4582-b11a-60738ad7760f/kube-rbac-proxy-metrics/0.log"
Apr 19 12:44:26.822089 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:26.822056 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-vbdrc_eaa8980b-c94a-4ef2-8be6-d92a83338f04/networking-console-plugin/0.log"
Apr 19 12:44:27.183355 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.183322 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"]
Apr 19 12:44:27.187200 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.187178 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.189387 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.189369 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxrl8\"/\"kube-root-ca.crt\""
Apr 19 12:44:27.190192 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.190172 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxrl8\"/\"openshift-service-ca.crt\""
Apr 19 12:44:27.190246 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.190182 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kxrl8\"/\"default-dockercfg-qptrb\""
Apr 19 12:44:27.195685 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.195662 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"]
Apr 19 12:44:27.317364 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.317332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-proc\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.317566 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.317391 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-lib-modules\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.317566 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.317499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-podres\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.317566 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.317538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-sys\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.317667 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.317587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7n5\" (UniqueName: \"kubernetes.io/projected/0a8e427c-5aac-4964-895c-45f62ffa7292-kube-api-access-pq7n5\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.318924 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.318900 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/2.log"
Apr 19 12:44:27.323127 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.323095 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-c8kc9_88822cf0-d332-4d8f-ab99-d2460f2ad404/console-operator/3.log"
Apr 19 12:44:27.418354 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418313 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-lib-modules\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418550 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-podres\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418550 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-sys\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418550 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418444 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7n5\" (UniqueName: \"kubernetes.io/projected/0a8e427c-5aac-4964-895c-45f62ffa7292-kube-api-access-pq7n5\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418550 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-proc\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418550 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-lib-modules\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418555 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-proc\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418565 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-podres\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.418742 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.418544 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a8e427c-5aac-4964-895c-45f62ffa7292-sys\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.426349 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.426315 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7n5\" (UniqueName: \"kubernetes.io/projected/0a8e427c-5aac-4964-895c-45f62ffa7292-kube-api-access-pq7n5\") pod \"perf-node-gather-daemonset-5khrb\" (UID: \"0a8e427c-5aac-4964-895c-45f62ffa7292\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.497894 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.497801 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:27.623986 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.623956 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"]
Apr 19 12:44:27.626597 ip-10-0-131-150 kubenswrapper[2567]: W0419 12:44:27.626564 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0a8e427c_5aac_4964_895c_45f62ffa7292.slice/crio-a9068dec00a35329889b8e47c7624ef611b0183df28c01e9b31d391b836573d7 WatchSource:0}: Error finding container a9068dec00a35329889b8e47c7624ef611b0183df28c01e9b31d391b836573d7: Status 404 returned error can't find the container with id a9068dec00a35329889b8e47c7624ef611b0183df28c01e9b31d391b836573d7
Apr 19 12:44:27.628162 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.628141 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:44:27.817408 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:27.817319 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lfqxb_dea1fdb1-ab00-4985-b0cb-3078be61f4c5/download-server/0.log"
Apr 19 12:44:28.153178 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:28.153083 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb" event={"ID":"0a8e427c-5aac-4964-895c-45f62ffa7292","Type":"ContainerStarted","Data":"bb0a59084d5ba786e768ee8b7bfb141aee4a2b9d47ac421f677aa075b880378c"}
Apr 19 12:44:28.153178 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:28.153133 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb" event={"ID":"0a8e427c-5aac-4964-895c-45f62ffa7292","Type":"ContainerStarted","Data":"a9068dec00a35329889b8e47c7624ef611b0183df28c01e9b31d391b836573d7"}
Apr 19 12:44:28.153577 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:28.153203 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:28.168752 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:28.168705 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb" podStartSLOduration=1.168691176 podStartE2EDuration="1.168691176s" podCreationTimestamp="2026-04-19 12:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:44:28.166264775 +0000 UTC m=+2076.920456602" watchObservedRunningTime="2026-04-19 12:44:28.168691176 +0000 UTC m=+2076.922883003"
Apr 19 12:44:29.061282 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:29.061257 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7jlhs_bb22769b-c18a-471e-9118-2fca21dc6606/dns/0.log"
Apr 19 12:44:29.079536 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:29.079514 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7jlhs_bb22769b-c18a-471e-9118-2fca21dc6606/kube-rbac-proxy/0.log"
Apr 19 12:44:29.138166 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:29.138138 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8gjrs_8cd0f946-6502-4d2a-94d4-721582219a2f/dns-node-resolver/0.log"
Apr 19 12:44:29.663282 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:29.663252 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zfq9x_2527ca81-1ebd-4808-a264-00f75b2caea4/node-ca/0.log"
Apr 19 12:44:30.392571 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:30.392539 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfdqt64_550eb47b-2375-423e-964b-6fd281a5d1a2/istio-proxy/0.log"
Apr 19 12:44:30.582183 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:30.582149 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-7w8qj_10f1dd05-cbb1-4144-b0f7-2d667d234a5b/istio-proxy/0.log"
Apr 19 12:44:31.131049 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:31.131019 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l5qnw_0cdd2a15-9ab0-45f9-9373-938372482e1a/serve-healthcheck-canary/0.log"
Apr 19 12:44:31.534195 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:31.534161 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b5csj_a80fa3ab-60ec-4151-bfb0-4bc0ed470f79/insights-operator/0.log"
Apr 19 12:44:31.534783 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:31.534763 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b5csj_a80fa3ab-60ec-4151-bfb0-4bc0ed470f79/insights-operator/1.log"
Apr 19 12:44:31.614188 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:31.614159 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-srrrj_c089de39-3da5-44fa-94ec-ad70051ece2c/kube-rbac-proxy/0.log"
Apr 19 12:44:31.632804 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:31.632778 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-srrrj_c089de39-3da5-44fa-94ec-ad70051ece2c/exporter/0.log"
Apr 19 12:44:31.652098 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:31.652075 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-srrrj_c089de39-3da5-44fa-94ec-ad70051ece2c/extractor/0.log"
Apr 19 12:44:33.740166 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:33.740137 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-676bcb86f4-nxd54_b3d0a41c-19e8-426a-a855-03cf0363ad5a/manager/0.log"
Apr 19 12:44:34.166475 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:34.166400 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-5khrb"
Apr 19 12:44:34.814772 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:34.814739 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-688fc496d-8v748_5d09d25b-d17a-4e56-8370-75c92ceb73de/manager/0.log"
Apr 19 12:44:39.361433 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:39.361354 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-pllfr_d63f033d-38ec-4dd5-be2a-1c76af7cde7d/migrator/0.log"
Apr 19 12:44:39.380252 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:39.380229 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-pllfr_d63f033d-38ec-4dd5-be2a-1c76af7cde7d/graceful-termination/0.log"
Apr 19 12:44:39.746522 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:39.746473 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xhcck_0ff01778-4b50-4a8a-ab4e-abf54e99b970/kube-storage-version-migrator-operator/1.log"
Apr 19 12:44:39.747325 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:39.747305 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-xhcck_0ff01778-4b50-4a8a-ab4e-abf54e99b970/kube-storage-version-migrator-operator/0.log"
Apr 19 12:44:40.641533 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.641507 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57vt2_d61ab6ab-1e8a-4e45-97e4-d3bb1ed06216/kube-multus/0.log"
Apr 19 12:44:40.667902 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.667871 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2tbjd_f601721a-a6ac-4b15-8bc0-48274f620286/kube-multus-additional-cni-plugins/0.log"
Apr 19 12:44:40.687561 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.687536 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2tbjd_f601721a-a6ac-4b15-8bc0-48274f620286/egress-router-binary-copy/0.log"
Apr 19 12:44:40.706074 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.706044 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2tbjd_f601721a-a6ac-4b15-8bc0-48274f620286/cni-plugins/0.log"
Apr 19 12:44:40.724153 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.724093 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2tbjd_f601721a-a6ac-4b15-8bc0-48274f620286/bond-cni-plugin/0.log"
Apr 19 12:44:40.743954 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.743929 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2tbjd_f601721a-a6ac-4b15-8bc0-48274f620286/routeoverride-cni/0.log"
Apr 19 12:44:40.762514 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.762483 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2tbjd_f601721a-a6ac-4b15-8bc0-48274f620286/whereabouts-cni-bincopy/0.log"
Apr 19 12:44:40.782575 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:40.782546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2tbjd_f601721a-a6ac-4b15-8bc0-48274f620286/whereabouts-cni/0.log"
Apr 19 12:44:41.237700 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:41.237671 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rxnrj_3a9d31f9-fb41-43e0-9946-0611710438a1/network-metrics-daemon/0.log"
Apr 19 12:44:41.257944 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:41.257918 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rxnrj_3a9d31f9-fb41-43e0-9946-0611710438a1/kube-rbac-proxy/0.log"
Apr 19 12:44:42.041593 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.041565 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-controller/0.log"
Apr 19 12:44:42.058470 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.058441 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/0.log"
Apr 19 12:44:42.067282 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.067255 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovn-acl-logging/1.log"
Apr 19 12:44:42.083188 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.083152 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/kube-rbac-proxy-node/0.log"
Apr 19 12:44:42.102255 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.102224 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/kube-rbac-proxy-ovn-metrics/0.log"
Apr 19 12:44:42.118826 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.118796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/northd/0.log"
Apr 19 12:44:42.138627 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.138602 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/nbdb/0.log"
Apr 19 12:44:42.158041 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.158016 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/sbdb/0.log"
Apr 19 12:44:42.258897 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:42.258871 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5n96s_08fea0f9-3a6e-4ab6-b269-5668dab364ea/ovnkube-controller/0.log"
Apr 19 12:44:43.864640 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:43.864607 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-69kxr_c60217db-a1f2-446f-bada-9675c7c62201/network-check-target-container/0.log"
Apr 19 12:44:44.868736 ip-10-0-131-150 kubenswrapper[2567]: I0419 12:44:44.868710 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qgqpz_2c805a23-696a-4038-acd9-e934f8c66c1d/iptables-alerter/0.log"