Apr 23 01:09:52.105903 ip-10-0-138-235 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 01:09:52.506626 ip-10-0-138-235 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:09:52.507633 ip-10-0-138-235 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 01:09:52.507633 ip-10-0-138-235 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:09:52.507633 ip-10-0-138-235 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 01:09:52.507633 ip-10-0-138-235 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 01:09:52.508791 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.508692 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 01:09:52.512496 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512480 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:09:52.512496 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512497 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512501 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512505 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512508 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512511 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512514 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512516 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512533 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
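The five deprecation warnings above all point at the same fix: move the flag values into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump later in this log). A minimal sketch of the equivalent kubelet.config.k8s.io/v1beta1 fields, with values copied from that FLAG dump; the eviction threshold is an illustrative placeholder, and on OpenShift this file is rendered by the Machine Config Operator rather than edited by hand:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from the FLAG dump below)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration has no config-file equivalent; the warning
# says to use eviction thresholds instead (this threshold value is illustrative)
evictionHard:
  memory.available: 100Mi

--pod-infra-container-image likewise has no KubeletConfiguration field; per the server.go:212 message below it should also be set in the remote runtime (for CRI-O, e.g. pause_image in crio.conf).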
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512537 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512540 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512543 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512545 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512548 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512551 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512554 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512556 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512559 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512562 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512565 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512567 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:09:52.512574 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512570 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512572 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512575 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512578 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512581 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512583 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512586 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512596 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512599 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512602 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512605 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512619 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512622 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512624 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512627 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512630 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512633 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512635 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512638 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512640 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:09:52.513058 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512649 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512652 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512654 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512657 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512659 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512662 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512664 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512666 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512669 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512671 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512674 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512677 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512679 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512682 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512685 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512688 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512690 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512693 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512696 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512698 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:09:52.513541 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512701 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512704 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512706 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512709 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512711 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512714 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512716 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512718 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512721 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512723 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512726 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512729 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512731 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512741 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512746 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512750 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512753 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512756 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512759 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:09:52.514053 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512762 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512765 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512768 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512771 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512774 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.512777 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513246 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513252 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513256 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513259 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513262 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513265 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513268 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513271 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513274 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513277 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513280 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513283 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513286 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513288 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:09:52.514510 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513291 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513293 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513296 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513298 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513301 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513303 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513306 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513309 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513312 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513314 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513317 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513319 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513322 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513325 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513327 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513330 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513333 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513335 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513338 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513341 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:09:52.515037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513344 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513346 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513349 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513351 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513354 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513356 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513359 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513363 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513366 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513369 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513372 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513375 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513378 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513381 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513384 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513386 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513389 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513391 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:09:52.515526 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513393 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513396 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513399 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513401 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513404 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513406 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513409 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513411 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513415 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513418 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513421 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513424 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513426 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513430 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513432 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513435 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513437 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513440 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513442 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:09:52.515984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513445 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513447 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513450 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513453 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513455 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513458 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513460 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513466 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513469 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513472 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513475 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513477 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513480 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513482 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.513485 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513568 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513578 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513588 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513597 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513602 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513605 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 01:09:52.516451 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513622 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513627 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513630 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513633 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513637 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513640 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513643 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513647 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513650 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513653 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513656 2569 flags.go:64] FLAG: --cloud-config=""
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513659 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513662 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513666 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513669 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513672 2569 flags.go:64] FLAG: --config-dir=""
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513675 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513678 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513684 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513687 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513691 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513694 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513697 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513700 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 01:09:52.516976 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513703 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513706 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513709 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513715 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513719 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513722 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513725 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513728 2569 flags.go:64] FLAG: --enable-server="true"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513739 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513743 2569 flags.go:64] FLAG: --event-burst="100"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513747 2569 flags.go:64] FLAG: --event-qps="50"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513750 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513753 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513756 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513760 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513763 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513766 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513769 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513772 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513775 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513778 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513781 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513784 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513787 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513790 2569 flags.go:64] FLAG: --feature-gates=""
Apr 23 01:09:52.517556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513794 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513797 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513801 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513804 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513807 2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513810 2569 flags.go:64] FLAG: --help="false"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513813 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513817 2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513820 2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513822 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513826 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513830 2569 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513833 2569 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513836 2569 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513839 2569 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513842 2569 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513846 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513849 2569 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513852 2569 flags.go:64] FLAG: --kube-reserved=""
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513854 2569 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513857 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513860 2569 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513863 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513866 2569 flags.go:64] FLAG: --lock-file=""
Apr 23 01:09:52.518172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513869 2569 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513872 2569 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513875 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513881 2569 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513884 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513887 2569 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513890 2569 flags.go:64] FLAG: --logging-format="text"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513893 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513896 2569 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513899 2569 flags.go:64] FLAG: --manifest-url=""
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513902 2569 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513912 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513915 2569 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513919 2569 flags.go:64] FLAG: --max-pods="110"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513922 2569 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513925 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513928 2569 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513931 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513934 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513937 2569 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513941 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513949 2569 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513952 2569 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513956 2569 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 01:09:52.518783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513959 2569 flags.go:64] FLAG: --pod-cidr=""
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513962 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513968 2569 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513972 2569 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513975 2569 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513978 2569 flags.go:64] FLAG: --port="10250"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513981 2569 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513985 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0948f3fc064149912"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513988 2569 flags.go:64] FLAG: --qos-reserved=""
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513991 2569 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513994 2569 flags.go:64] FLAG: --register-node="true"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.513997 2569 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514000 2569 flags.go:64] FLAG: --register-with-taints=""
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514003 2569 flags.go:64] FLAG: --registry-burst="10"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514007 2569 flags.go:64] FLAG: --registry-qps="5"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514009 2569 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514012 2569 flags.go:64] FLAG: --reserved-memory=""
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514017 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514020 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514025 2569 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514028 2569 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514031 2569 flags.go:64] FLAG: --runonce="false"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514034 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514037 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514040 2569 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 01:09:52.519456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514043 2569 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514046 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514049 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514052 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514057 2569 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514061 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514064 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514067 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514070 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514073 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514076 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514079 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514084 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514087 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514090 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514093 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514096 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514099 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514102 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514105 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514109 2569 flags.go:64] FLAG: --v="2"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514113 2569 flags.go:64] FLAG: --version="false"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514117 2569 flags.go:64] FLAG: --vmodule=""
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514121 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.514126 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 01:09:52.520112 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514224 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514229 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514232 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514235 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514237 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514240 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514243 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514246 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514248 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514251 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514254 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514256 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514260 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514262 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514265 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514268 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514270 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514273 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514276 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514278 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:09:52.520729 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514281 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514284 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514288 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514291 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514294 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514296 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514299 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514303 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514307 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514309 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514312 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514316 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514319 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514323 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514326 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514329 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514331 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514334 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514337 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:09:52.521254 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514339 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514342 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514345 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514348 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514350 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514353 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514356 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514359 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514361 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514364 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514366 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514369 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514371 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514374 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514376 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514379 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514381 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514384 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514386 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:09:52.522037 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514389 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514392 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514394 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514396 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514399 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514404 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514407 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514411 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514414 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514416 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514419 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514421 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514424 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514427 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514429 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514432 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514434 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514437 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514439 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:09:52.522723 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514442 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514445 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514447 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514450 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514453 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514455 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514458 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514461 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.514463 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:09:52.523195 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.516050 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 01:09:52.523758 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.523738 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 01:09:52.523796 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.523759 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 01:09:52.523824 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523809 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:09:52.523824 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523814 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:09:52.523824 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523817 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:09:52.523824 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523820 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:09:52.523824 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523824 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523828 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523831 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523834 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523837 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523840 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523843 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523846 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523849 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523852 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523856 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523859 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523861 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523864 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523866 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523870 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523873 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523878 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523882 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:09:52.523952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523885 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523888 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523891 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523894 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523896 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523899 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523902 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523905 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523908 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523910 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523913 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523916 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523918 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523921 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523924 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523927 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523930 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523933 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523936 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523939 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:09:52.524460 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523941 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523944 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523947 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523950 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523952 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523955 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523958 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523961 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523964 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523966 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523969 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523972 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523975 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523978 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523980 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523983 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523985 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523988 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523990 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523993 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:09:52.524973 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523996 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.523998 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524001 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524004 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524006 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524009 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524011 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524015 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524018 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524029 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524032 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524035 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524037 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524041 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524044 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524047 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524049 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524052 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524055 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524058 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:09:52.525458 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524060 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524062 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524065 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.524070 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524167 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524172 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524175 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524178 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524180 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524183 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524186 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524188 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524191 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524194 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524197 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524199 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 01:09:52.525966 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524202 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524204 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524207 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524210 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524212 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524215 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524219 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524223 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524226 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524229 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524233 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524236 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524238 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524241 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524244 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524246 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524249 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524251 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524254 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524256 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 01:09:52.526362 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524259 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524262 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524265 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524267 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524270 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524272 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524275 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524278 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524280 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524283 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524286 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524289 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524292 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524295 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524297 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524300 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524303 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524306 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524308 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524311 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 01:09:52.526863 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524313 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524316 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524318 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524322 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524324 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524327 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524329 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524332 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524334 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524337 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524340 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524342 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524344 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524347 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524349 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524352 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524354 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524357 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524360 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524362 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 01:09:52.527349 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524365 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524367 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524370 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524372 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524375 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524377 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524380 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524383 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524386 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524388 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524391 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524393 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524396 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:52.524398 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.524404 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 01:09:52.527898 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.525023 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 01:09:52.528278 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.527523 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 01:09:52.528384 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.528372 2569 server.go:1019] "Starting client certificate rotation"
Apr 23 01:09:52.528491 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.528475 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 01:09:52.528527 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.528517 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 01:09:52.549563 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.549543 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 01:09:52.551743 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.551724 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 01:09:52.566289 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.566268 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 23 01:09:52.572457 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.572444 2569 log.go:25] "Validated CRI v1 image API"
Apr 23 01:09:52.574204 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.574187 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 01:09:52.578306 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.578285 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b93fe25f-9644-4273-bd76-59e977afdbde:/dev/nvme0n1p3 f55145ac-672b-46d0-90cc-a30f167181b0:/dev/nvme0n1p4]
Apr 23 01:09:52.578370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.578306 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 01:09:52.582272 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.582249 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 01:09:52.585536 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.585425 2569 manager.go:217] Machine: {Timestamp:2026-04-23 01:09:52.583653864 +0000 UTC m=+0.369328214 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101950 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a47c80e2c02d492702b8b9856966d SystemUUID:ec2a47c8-0e2c-02d4-9270-2b8b9856966d BootID:7d3889de-4d01-43df-85bf-ba8235061d64 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5a:0e:ae:77:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5a:0e:ae:77:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:fa:27:08:23:f5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 01:09:52.585536 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.585529 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 01:09:52.585664 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.585627 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 01:09:52.586659 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.586635 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 01:09:52.586800 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.586662 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-235.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 01:09:52.586848 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.586808 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 01:09:52.586848 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.586818 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 01:09:52.586848 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.586832 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 01:09:52.587489 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.587479 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 01:09:52.588656 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.588645 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 01:09:52.588941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.588931 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 01:09:52.591328 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.591317 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 01:09:52.591370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.591333 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 01:09:52.591370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.591345 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 01:09:52.591370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.591354 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 23 01:09:52.591370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.591363 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 01:09:52.592465 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.592407 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 01:09:52.592536 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.592478 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 01:09:52.595150 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.595133 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 01:09:52.596470 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.596454 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 01:09:52.597998 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.597984 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598005 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598014 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598024 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598033 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598042 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598050 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598059 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598069 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 01:09:52.598078 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598078 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 01:09:52.598343 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598091 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 01:09:52.598343 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.598104 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 01:09:52.599419 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.599408 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 01:09:52.599468 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.599423 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 01:09:52.601796 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.601774 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-235.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 01:09:52.601881 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.601791 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-235.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 01:09:52.601881 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.601839 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 01:09:52.603603 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.603589 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 01:09:52.603700 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.603647 2569 server.go:1295] "Started kubelet"
Apr 23 01:09:52.603750 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.603726 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 01:09:52.603972 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.603901 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 01:09:52.604042 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.604016 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 01:09:52.604459 ip-10-0-138-235 systemd[1]: Started Kubernetes Kubelet.
Apr 23 01:09:52.605108 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.605090 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 01:09:52.605657 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.605644 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 01:09:52.610375 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.610356 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 01:09:52.610375 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.610374 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 01:09:52.610806 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.610785 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:52.611291 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611154 2569 factory.go:55] Registering systemd factory
Apr 23 01:09:52.611370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611303 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 23 01:09:52.611428 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611151 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 01:09:52.611428 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611183 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 01:09:52.611428 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611398 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 01:09:52.611557 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611495 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 01:09:52.611557 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611504 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 01:09:52.611766 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611750 2569 factory.go:153] Registering CRI-O factory
Apr 23 01:09:52.611766 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611768 2569 factory.go:223] Registration of the crio container factory successfully
Apr 23 01:09:52.611846 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611834 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 01:09:52.611881 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611856 2569 factory.go:103] Registering Raw factory
Apr 23 01:09:52.611881 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.611870 2569 manager.go:1196] Started watching for new ooms in manager
Apr 23 01:09:52.612309 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.612294 2569 manager.go:319] Starting recovery of all containers
Apr 23 01:09:52.613517 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.613495 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 01:09:52.617556 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.617505 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 01:09:52.617809 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.617780 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-235.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 01:09:52.618634 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.617669 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-235.ec2.internal.18a8d721ab2ffae8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-235.ec2.internal,UID:ip-10-0-138-235.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-235.ec2.internal,},FirstTimestamp:2026-04-23 01:09:52.60360164 +0000 UTC m=+0.389275993,LastTimestamp:2026-04-23 01:09:52.60360164 +0000 UTC m=+0.389275993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-235.ec2.internal,}"
Apr 23 01:09:52.619938 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.619907 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4dp5r"
Apr 23 01:09:52.621201 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.621025 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 01:09:52.625289 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.625270 2569 manager.go:324] Recovery completed
Apr 23 01:09:52.626650 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.626591 2569 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 23 01:09:52.626794 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.626773 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4dp5r"
Apr 23 01:09:52.629675 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.629661 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:09:52.632157 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.632143 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:09:52.632227 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.632168 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:09:52.632227 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.632179 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:09:52.632667 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.632652 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 01:09:52.632667 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.632665 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 01:09:52.632742 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.632679 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 01:09:52.634204 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.634135 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-235.ec2.internal.18a8d721ace3b030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-235.ec2.internal,UID:ip-10-0-138-235.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-235.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-235.ec2.internal,},FirstTimestamp:2026-04-23 01:09:52.632156208 +0000 UTC m=+0.417830556,LastTimestamp:2026-04-23 01:09:52.632156208 +0000 UTC m=+0.417830556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-235.ec2.internal,}"
Apr 23 01:09:52.635824 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.635812 2569 policy_none.go:49] "None policy: Start"
Apr 23 01:09:52.635868 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.635829 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 01:09:52.635868 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.635840 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 01:09:52.678203 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.678185 2569 manager.go:341] "Starting Device Plugin manager"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.678219 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.678231 2569 server.go:85] "Starting device plugin registration server"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.678499 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.678514 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.678600 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.678699 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.678708 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.679287 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 01:09:52.702372 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.679319 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:52.761585 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.761518 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 01:09:52.761585 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.761555 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 01:09:52.761585 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.761574 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 01:09:52.761585 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.761580 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 01:09:52.761829 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.761622 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 01:09:52.765711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.765688 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:09:52.779484 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.779468 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:09:52.780475 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.780459 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:09:52.780547 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.780488 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:09:52.780547 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.780500 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:09:52.780547 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.780523 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.789242 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.789226 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.789285 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.789249 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-235.ec2.internal\": node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:52.802271 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.802251 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:52.862100 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.862076 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal"]
Apr 23 01:09:52.862174 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.862147 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:09:52.863921 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.863904 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:09:52.863997 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.863938 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:09:52.863997 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.863948 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:09:52.866086 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.866073 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:09:52.866232 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.866218 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.866276 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.866248 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:09:52.869923 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.869908 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:09:52.870005 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.869913 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:09:52.870005 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.869970 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:09:52.870005 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.869938 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:09:52.870005 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.869988 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:09:52.870144 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.870018 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:09:52.872147 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.872127 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.872226 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.872154 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 01:09:52.872826 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.872812 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 01:09:52.872888 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.872833 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 01:09:52.872888 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.872845 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeHasSufficientPID"
Apr 23 01:09:52.889139 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.889121 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-235.ec2.internal\" not found" node="ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.892449 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.892434 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-235.ec2.internal\" not found" node="ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.902308 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:52.902291 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:52.913500 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.913484 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1be7c8c689749547130833b567b2938-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal\" (UID: \"c1be7c8c689749547130833b567b2938\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.913569 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.913510 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1be7c8c689749547130833b567b2938-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal\" (UID: \"c1be7c8c689749547130833b567b2938\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:52.913569 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:52.913528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b29f92d08abcb80be4466bff586fc859-config\") pod \"kube-apiserver-proxy-ip-10-0-138-235.ec2.internal\" (UID: \"b29f92d08abcb80be4466bff586fc859\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.002518 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.002482 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.013991 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.013930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b29f92d08abcb80be4466bff586fc859-config\") pod \"kube-apiserver-proxy-ip-10-0-138-235.ec2.internal\" (UID: \"b29f92d08abcb80be4466bff586fc859\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.013991 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.013964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1be7c8c689749547130833b567b2938-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal\" (UID: \"c1be7c8c689749547130833b567b2938\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.014139 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.014027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b29f92d08abcb80be4466bff586fc859-config\") pod \"kube-apiserver-proxy-ip-10-0-138-235.ec2.internal\" (UID: \"b29f92d08abcb80be4466bff586fc859\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.014139 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.014080 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1be7c8c689749547130833b567b2938-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal\" (UID: \"c1be7c8c689749547130833b567b2938\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.014139 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.014133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1be7c8c689749547130833b567b2938-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal\" (UID: \"c1be7c8c689749547130833b567b2938\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.014257 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.014163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1be7c8c689749547130833b567b2938-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal\" (UID: \"c1be7c8c689749547130833b567b2938\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.103356 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.103318 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.190798 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.190768 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.195195 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.195177 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.203476 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.203454 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.304058 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.303975 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.404591 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.404556 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.505266 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.505235 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.528791 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.528762 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 01:09:53.529504 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.528913 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 01:09:53.606276 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.606207 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.611652 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.611629 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 01:09:53.620434 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.620412 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 01:09:53.629088 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.629060 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 01:04:52 +0000 UTC" deadline="2028-01-23 22:46:09.57809919 +0000 UTC"
Apr 23 01:09:53.629088 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.629086 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15381h36m15.949016505s"
Apr 23 01:09:53.637242 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.637223 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xb85v"
Apr 23 01:09:53.645318 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.645303 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xb85v"
Apr 23 01:09:53.654982 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.654881 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:09:53.669016 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:53.668989 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1be7c8c689749547130833b567b2938.slice/crio-95e03605635d51e0d5e9dd93bfebcbea216a13d33f096e563461e086833fa915 WatchSource:0}: Error finding container 95e03605635d51e0d5e9dd93bfebcbea216a13d33f096e563461e086833fa915: Status 404 returned error can't find the container with id 95e03605635d51e0d5e9dd93bfebcbea216a13d33f096e563461e086833fa915
Apr 23 01:09:53.669236 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:53.669209 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29f92d08abcb80be4466bff586fc859.slice/crio-6880fbb179ee2d46b7c8c6ff6f76b0da751ef65fef81144bf96c2a749c3e989e WatchSource:0}: Error finding container 6880fbb179ee2d46b7c8c6ff6f76b0da751ef65fef81144bf96c2a749c3e989e: Status 404 returned error can't find the container with id 6880fbb179ee2d46b7c8c6ff6f76b0da751ef65fef81144bf96c2a749c3e989e
Apr 23 01:09:53.674881 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.674864 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:09:53.706436 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:53.706410 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-235.ec2.internal\" not found"
Apr 23 01:09:53.742959 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.742932 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:09:53.764513 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.764465 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal" event={"ID":"c1be7c8c689749547130833b567b2938","Type":"ContainerStarted","Data":"95e03605635d51e0d5e9dd93bfebcbea216a13d33f096e563461e086833fa915"}
Apr 23 01:09:53.765312 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.765290 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal" event={"ID":"b29f92d08abcb80be4466bff586fc859","Type":"ContainerStarted","Data":"6880fbb179ee2d46b7c8c6ff6f76b0da751ef65fef81144bf96c2a749c3e989e"}
Apr 23 01:09:53.811286 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.811265 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.822639 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.822620 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 01:09:53.823386 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.823375 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal"
Apr 23 01:09:53.831225 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:53.831202 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 01:09:54.046829 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.046764 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:09:54.592908 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.592879 2569 apiserver.go:52] "Watching apiserver"
Apr 23 01:09:54.598624 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.598586 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 01:09:54.598985 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.598957 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5mm4v","openshift-network-operator/iptables-alerter-c5z7x","kube-system/konnectivity-agent-6wfbb","kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal","openshift-dns/node-resolver-fgvhm","openshift-image-registry/node-ca-grrph","openshift-network-diagnostics/network-check-target-jbfxg","openshift-ovn-kubernetes/ovnkube-node-79v47","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh","openshift-cluster-node-tuning-operator/tuned-s6dcr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal","openshift-multus/multus-additional-cni-plugins-gzwpj","openshift-multus/multus-c6dx7"]
Apr 23 01:09:54.601757 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.601735 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.604021 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.603876 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2v2bq\""
Apr 23 01:09:54.604021 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.603887 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 01:09:54.604021 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.603909 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:09:54.606010 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.605989 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fgvhm"
Apr 23 01:09:54.607904 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.607856 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 01:09:54.607904 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.607881 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bmzbz\""
Apr 23 01:09:54.608045 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.607965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 01:09:54.608045 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.608027 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-grrph"
Apr 23 01:09:54.608332 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.608177 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:09:54.608332 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.608239 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:09:54.610017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.609846 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 01:09:54.610017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.609923 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 01:09:54.610455 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.610181 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 01:09:54.610455 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.610437 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wfw99\""
Apr 23 01:09:54.612802 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.612786 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.612893 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.612844 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.614827 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.614807 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 01:09:54.615007 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.614990 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 01:09:54.615163 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.615136 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:09:54.615250 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.615197 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:09:54.615341 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.615323 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 01:09:54.615405 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.615347 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 01:09:54.615405 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.615383 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j4s9c\""
Apr 23 01:09:54.615772 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.615693 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qtv6m\""
Apr 23 01:09:54.615772 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.615734 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 01:09:54.615772 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.615756 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 01:09:54.616273 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.616253 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 01:09:54.616544 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.616523 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 01:09:54.616633 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.616526 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 01:09:54.619296 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.618120 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gzwpj"
Apr 23 01:09:54.620346 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83355238-6978-4f5d-8b07-0ea3d3784353-host\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph"
Apr 23 01:09:54.620439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83355238-6978-4f5d-8b07-0ea3d3784353-serviceca\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph"
Apr 23 01:09:54.620439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620403 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-cni-bin\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-host\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysconfig\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-ovn\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-log-socket\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-cni-netd\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-ovnkube-config\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-env-overrides\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmrd7\" (UniqueName: \"kubernetes.io/projected/52115d8e-033b-485d-aa67-434e7ae395d5-kube-api-access-mmrd7\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-tmp\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdl8f\" (UniqueName: \"kubernetes.io/projected/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-kube-api-access-hdl8f\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-slash\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.620670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620672 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-socket-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-registration-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620757 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-sys-fs\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-systemd-units\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-kubernetes\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysctl-conf\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-systemd\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620876 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-sys\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620906 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620915 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwf2b\" (UniqueName: \"kubernetes.io/projected/83355238-6978-4f5d-8b07-0ea3d3784353-kube-api-access-lwf2b\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620946 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620974 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-etc-selinux\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620984 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c6dx7"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.620997 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-lib-modules\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621039 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-var-lib-kubelet\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621092 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-tuned\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621120 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52115d8e-033b-485d-aa67-434e7ae395d5-ovn-node-metrics-cert\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621162 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621186 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-device-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9k5g\" (UniqueName: \"kubernetes.io/projected/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-kube-api-access-x9k5g\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-modprobe-d\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysctl-d\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b025c029-af84-46be-a329-3c26d61f764a-hosts-file\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621297 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ftltm\""
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621319 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621298 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-run-netns\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621338 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-systemd\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621407 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-var-lib-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-kubelet\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621453 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-run\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621476 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b025c029-af84-46be-a329-3c26d61f764a-tmp-dir\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-etc-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.621869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-node-log\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.622597 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621559 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.622597 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621583 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-ovnkube-script-lib\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.622597 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.621627 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgd4w\" (UniqueName: \"kubernetes.io/projected/b025c029-af84-46be-a329-3c26d61f764a-kube-api-access-cgd4w\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm"
Apr 23 01:09:54.622982 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.622965 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vk6fd\""
Apr 23 01:09:54.622982 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.622979 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 01:09:54.623359 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.623342 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c5z7x"
Apr 23 01:09:54.625270 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.625246 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vssrh\""
Apr 23 01:09:54.625338 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.625308 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:09:54.625338 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.625325 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 01:09:54.625438 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.625309 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 01:09:54.625665 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.625643 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6wfbb"
Apr 23 01:09:54.627553 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.627532 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 01:09:54.627650 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.627564 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 01:09:54.627738 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.627722 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-kl525\""
Apr 23 01:09:54.643772 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.643756 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 01:09:54.645963 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.645936 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 01:04:53 +0000 UTC" deadline="2027-11-15 23:07:16.100271715 +0000 UTC"
Apr 23 01:09:54.646044 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.645965 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13725h57m21.45430899s"
Apr 23 01:09:54.712306 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.712280 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 01:09:54.722550 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgd4w\" (UniqueName: \"kubernetes.io/projected/b025c029-af84-46be-a329-3c26d61f764a-kube-api-access-cgd4w\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm"
Apr 23 01:09:54.722690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83355238-6978-4f5d-8b07-0ea3d3784353-serviceca\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph"
Apr 23 01:09:54.722690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722590 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.722690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-system-cni-dir\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj"
Apr 23 01:09:54.722690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722663 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj"
Apr 23 01:09:54.722690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.722690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysconfig\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-ovn\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-ovnkube-config\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-env-overrides\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-socket-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-ovn\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-sys-fs\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-daemon-config\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7"
Apr 23 01:09:54.722980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.722863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysconfig\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.723271 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-socket-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.723271 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723245 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-sys-fs\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.723271 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-tmp\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.723415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723294 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/83355238-6978-4f5d-8b07-0ea3d3784353-serviceca\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph"
Apr 23 01:09:54.723415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-registration-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.723415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-registration-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh"
Apr 23 01:09:54.723415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-kubelet\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7"
Apr 23 01:09:54.723415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723387 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-conf-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7"
Apr 23 01:09:54.723415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723393 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-ovnkube-config\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.723415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723414 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-systemd-units\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-kubernetes\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr"
Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwf2b\" (UniqueName: \"kubernetes.io/projected/83355238-6978-4f5d-8b07-0ea3d3784353-kube-api-access-lwf2b\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph"
Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723491 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled.
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723513 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723541 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-etc-selinux\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723551 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-systemd-units\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723566 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-tuned\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-device-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-env-overrides\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hr4s\" (UniqueName: \"kubernetes.io/projected/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-kube-api-access-7hr4s\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.723755 ip-10-0-138-235 
kubenswrapper[2569]: I0423 01:09:54.723738 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-iptables-alerter-script\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.723755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723749 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-etc-selinux\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-modprobe-d\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723896 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-device-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysctl-d\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b025c029-af84-46be-a329-3c26d61f764a-hosts-file\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-modprobe-d\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysctl-d\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.723993 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-run-netns\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-kubernetes\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724031 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b025c029-af84-46be-a329-3c26d61f764a-hosts-file\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724069 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-systemd\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-run-netns\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-etc-kubernetes\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-run-systemd\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-netns\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724236 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-cni-bin\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-kubelet\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.724336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-run\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-node-log\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-kubelet\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-os-release\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724411 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-node-log\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-multus-certs\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724450 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cb7rg\" (UniqueName: \"kubernetes.io/projected/21b4a1fd-e436-4824-abb9-d40c296dc036-kube-api-access-cb7rg\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724463 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-run\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/643f99c6-2212-475f-8fa2-d1ea3fa8a17a-konnectivity-ca\") pod \"konnectivity-agent-6wfbb\" (UID: \"643f99c6-2212-475f-8fa2-d1ea3fa8a17a\") " pod="kube-system/konnectivity-agent-6wfbb" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724510 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83355238-6978-4f5d-8b07-0ea3d3784353-host\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724544 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-cni-bin\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724568 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21b4a1fd-e436-4824-abb9-d40c296dc036-cni-binary-copy\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/83355238-6978-4f5d-8b07-0ea3d3784353-host\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-socket-dir-parent\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-cni-bin\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-hostroot\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.725059 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724673 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-host\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724695 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-log-socket\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-cni-netd\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrd7\" (UniqueName: \"kubernetes.io/projected/52115d8e-033b-485d-aa67-434e7ae395d5-kube-api-access-mmrd7\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724764 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcb98\" (UniqueName: \"kubernetes.io/projected/608e8d52-e2cd-48e3-b524-0f0d764d9501-kube-api-access-tcb98\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724780 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-log-socket\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cni-binary-copy\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.725877 ip-10-0-138-235 
kubenswrapper[2569]: I0423 01:09:54.724803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-cni-netd\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkg5x\" (UniqueName: \"kubernetes.io/projected/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-kube-api-access-rkg5x\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-host\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724852 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdl8f\" (UniqueName: \"kubernetes.io/projected/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-kube-api-access-hdl8f\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-slash\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724907 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-cni-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysctl-conf\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-systemd\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-host-slash\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.724986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-sys\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.725877 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-cni-multus\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725110 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-sys\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725158 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-lib-modules\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-systemd\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725184 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-var-lib-kubelet\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52115d8e-033b-485d-aa67-434e7ae395d5-ovn-node-metrics-cert\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9k5g\" (UniqueName: \"kubernetes.io/projected/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-kube-api-access-x9k5g\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-lib-modules\") pod \"tuned-s6dcr\" 
(UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725297 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-sysctl-conf\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725327 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-var-lib-kubelet\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725379 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-os-release\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cnibin\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725661 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-system-cni-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725719 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-k8s-cni-cncf-io\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.726698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/643f99c6-2212-475f-8fa2-d1ea3fa8a17a-agent-certs\") pod \"konnectivity-agent-6wfbb\" (UID: \"643f99c6-2212-475f-8fa2-d1ea3fa8a17a\") " pod="kube-system/konnectivity-agent-6wfbb" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-var-lib-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725833 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-cnibin\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725855 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-host-slash\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-var-lib-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b025c029-af84-46be-a329-3c26d61f764a-tmp-dir\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-etc-openvswitch\") pod \"ovnkube-node-79v47\" (UID: 
\"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-ovnkube-script-lib\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.725965 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.726056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52115d8e-033b-485d-aa67-434e7ae395d5-etc-openvswitch\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.726156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b025c029-af84-46be-a329-3c26d61f764a-tmp-dir\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.726600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52115d8e-033b-485d-aa67-434e7ae395d5-ovnkube-script-lib\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.726916 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-etc-tuned\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.727432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.726987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-tmp\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.728124 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.727529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52115d8e-033b-485d-aa67-434e7ae395d5-ovn-node-metrics-cert\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.728712 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.728680 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 
23 01:09:54.728712 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.728701 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:09:54.728712 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.728715 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xdvsx for pod openshift-network-diagnostics/network-check-target-jbfxg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:54.728901 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.728783 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx podName:943b6178-2514-4112-956c-8705385d8a3d nodeName:}" failed. No retries permitted until 2026-04-23 01:09:55.228764224 +0000 UTC m=+3.014438561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xdvsx" (UniqueName: "kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx") pod "network-check-target-jbfxg" (UID: "943b6178-2514-4112-956c-8705385d8a3d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:54.730820 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.730465 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgd4w\" (UniqueName: \"kubernetes.io/projected/b025c029-af84-46be-a329-3c26d61f764a-kube-api-access-cgd4w\") pod \"node-resolver-fgvhm\" (UID: \"b025c029-af84-46be-a329-3c26d61f764a\") " pod="openshift-dns/node-resolver-fgvhm" Apr 23 01:09:54.731011 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.730988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwf2b\" (UniqueName: \"kubernetes.io/projected/83355238-6978-4f5d-8b07-0ea3d3784353-kube-api-access-lwf2b\") pod \"node-ca-grrph\" (UID: \"83355238-6978-4f5d-8b07-0ea3d3784353\") " pod="openshift-image-registry/node-ca-grrph" Apr 23 01:09:54.732084 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.732046 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdl8f\" (UniqueName: \"kubernetes.io/projected/ced0a103-b7b9-41d1-9355-6f1429a2a4a6-kube-api-access-hdl8f\") pod \"tuned-s6dcr\" (UID: \"ced0a103-b7b9-41d1-9355-6f1429a2a4a6\") " pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.732629 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.732593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrd7\" (UniqueName: \"kubernetes.io/projected/52115d8e-033b-485d-aa67-434e7ae395d5-kube-api-access-mmrd7\") pod \"ovnkube-node-79v47\" (UID: \"52115d8e-033b-485d-aa67-434e7ae395d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.733066 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.733049 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9k5g\" (UniqueName: \"kubernetes.io/projected/c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47-kube-api-access-x9k5g\") pod \"aws-ebs-csi-driver-node-6zhnh\" (UID: \"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.826350 ip-10-0-138-235 
kubenswrapper[2569]: I0423 01:09:54.826319 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hr4s\" (UniqueName: \"kubernetes.io/projected/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-kube-api-access-7hr4s\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.826350 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-iptables-alerter-script\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-etc-kubernetes\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-netns\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826418 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-cni-bin\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-os-release\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-etc-kubernetes\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-multus-certs\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826497 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-cni-bin\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 
ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826510 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-multus-certs\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826468 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-netns\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7rg\" (UniqueName: \"kubernetes.io/projected/21b4a1fd-e436-4824-abb9-d40c296dc036-kube-api-access-cb7rg\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826547 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-os-release\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.826571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/643f99c6-2212-475f-8fa2-d1ea3fa8a17a-konnectivity-ca\") pod \"konnectivity-agent-6wfbb\" (UID: \"643f99c6-2212-475f-8fa2-d1ea3fa8a17a\") " pod="kube-system/konnectivity-agent-6wfbb" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21b4a1fd-e436-4824-abb9-d40c296dc036-cni-binary-copy\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-socket-dir-parent\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-hostroot\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826710 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-hostroot\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826742 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tcb98\" (UniqueName: \"kubernetes.io/projected/608e8d52-e2cd-48e3-b524-0f0d764d9501-kube-api-access-tcb98\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826758 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-socket-dir-parent\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826767 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cni-binary-copy\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826806 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkg5x\" (UniqueName: \"kubernetes.io/projected/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-kube-api-access-rkg5x\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-cni-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-cni-multus\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-os-release\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cnibin\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " 
pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-system-cni-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-k8s-cni-cncf-io\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826975 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/643f99c6-2212-475f-8fa2-d1ea3fa8a17a-agent-certs\") pod \"konnectivity-agent-6wfbb\" (UID: \"643f99c6-2212-475f-8fa2-d1ea3fa8a17a\") " pod="kube-system/konnectivity-agent-6wfbb" Apr 23 01:09:54.827128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826975 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-iptables-alerter-script\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.826994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.827040 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827045 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-cni-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827078 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-cnibin\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/643f99c6-2212-475f-8fa2-d1ea3fa8a17a-konnectivity-ca\") pod \"konnectivity-agent-6wfbb\" (UID: \"643f99c6-2212-475f-8fa2-d1ea3fa8a17a\") " pod="kube-system/konnectivity-agent-6wfbb" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827118 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-cnibin\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827103 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-os-release\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827169 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-system-cni-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827174 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-cni-multus\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:54.827104 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:09:55.327085303 +0000 UTC m=+3.112759641 (durationBeforeRetry 500ms). 
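
[annotation] The metrics-certs mount fails above because the secret is not yet registered with the kubelet, and the operation executor schedules a retry 500ms out; later entries for the same volume show the delay doubling (1s, 2s, 4s, 8s). Below is a minimal Go sketch of that doubling schedule, inferred only from the logged durationBeforeRetry values — the function name is hypothetical and no upper cap is modeled, since the excerpt only shows delays up to 8s. The full Error line for this record follows in the log.

```go
package main

import (
	"fmt"
	"time"
)

// nextRetryDelay mirrors the doubling visible in the logged
// durationBeforeRetry values (500ms, 1s, 2s, 4s, 8s, ...).
// Hypothetical helper: the name and structure are illustrative,
// not kubelet source.
func nextRetryDelay(last time.Duration) time.Duration {
	const initial = 500 * time.Millisecond
	if last <= 0 {
		return initial
	}
	return 2 * last
}

func main() {
	var d time.Duration
	for i := 0; i < 5; i++ {
		d = nextRetryDelay(d)
		fmt.Println(d) // prints 500ms, 1s, 2s, 4s, 8s
	}
}
```
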
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827207 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-run-k8s-cni-cncf-io\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21b4a1fd-e436-4824-abb9-d40c296dc036-cni-binary-copy\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-host-slash\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827249 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-host-slash\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cni-binary-copy\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.827902 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827265 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cnibin\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-system-cni-dir\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " 
pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827380 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-system-cni-dir\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-daemon-config\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-kubelet\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-conf-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827487 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-host-var-lib-kubelet\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827507 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-conf-dir\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827873 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.828478 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.827936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21b4a1fd-e436-4824-abb9-d40c296dc036-multus-daemon-config\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.829344 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.829329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/643f99c6-2212-475f-8fa2-d1ea3fa8a17a-agent-certs\") pod \"konnectivity-agent-6wfbb\" (UID: \"643f99c6-2212-475f-8fa2-d1ea3fa8a17a\") " pod="kube-system/konnectivity-agent-6wfbb" Apr 23 01:09:54.834951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.834927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcb98\" (UniqueName: \"kubernetes.io/projected/608e8d52-e2cd-48e3-b524-0f0d764d9501-kube-api-access-tcb98\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:54.835305 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.835285 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hr4s\" (UniqueName: \"kubernetes.io/projected/b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb-kube-api-access-7hr4s\") pod \"multus-additional-cni-plugins-gzwpj\" (UID: \"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb\") " pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.835417 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.835397 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkg5x\" (UniqueName: \"kubernetes.io/projected/577e99e2-2b77-4e82-9822-3e4be2d6ba4d-kube-api-access-rkg5x\") pod \"iptables-alerter-c5z7x\" (UID: \"577e99e2-2b77-4e82-9822-3e4be2d6ba4d\") " pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.835469 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.835456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7rg\" (UniqueName: \"kubernetes.io/projected/21b4a1fd-e436-4824-abb9-d40c296dc036-kube-api-access-cb7rg\") pod \"multus-c6dx7\" (UID: \"21b4a1fd-e436-4824-abb9-d40c296dc036\") " pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.913429 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.913349 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" Apr 23 01:09:54.921106 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.921085 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fgvhm" Apr 23 01:09:54.929555 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.929536 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-grrph" Apr 23 01:09:54.935144 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.935126 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:09:54.941771 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.941751 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" Apr 23 01:09:54.948284 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.948267 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" Apr 23 01:09:54.953779 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.953755 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c6dx7" Apr 23 01:09:54.962286 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.962260 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c5z7x" Apr 23 01:09:54.966812 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:54.966792 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6wfbb" Apr 23 01:09:55.230530 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.230452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:09:55.230709 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:55.230589 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:09:55.230709 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:55.230603 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:09:55.230709 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:55.230631 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xdvsx for pod openshift-network-diagnostics/network-check-target-jbfxg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:55.230709 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:55.230692 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx podName:943b6178-2514-4112-956c-8705385d8a3d nodeName:}" failed. No retries permitted until 2026-04-23 01:09:56.230675089 +0000 UTC m=+4.016349444 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xdvsx" (UniqueName: "kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx") pod "network-check-target-jbfxg" (UID: "943b6178-2514-4112-956c-8705385d8a3d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:55.330860 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.330830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:55.331032 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:55.330976 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:55.331096 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:55.331050 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:09:56.331030678 +0000 UTC m=+4.116705021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:55.351296 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.351269 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83355238_6978_4f5d_8b07_0ea3d3784353.slice/crio-b4809b2f244deef576b642491d253d98bcc777a554ae6683f7e36367ee252a37 WatchSource:0}: Error finding container b4809b2f244deef576b642491d253d98bcc777a554ae6683f7e36367ee252a37: Status 404 returned error can't find the container with id b4809b2f244deef576b642491d253d98bcc777a554ae6683f7e36367ee252a37 Apr 23 01:09:55.352106 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.352082 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced0a103_b7b9_41d1_9355_6f1429a2a4a6.slice/crio-4aca9988ef42e4ff0357c1d865235332b815797e8bab550841dcaac4377902be WatchSource:0}: Error finding container 4aca9988ef42e4ff0357c1d865235332b815797e8bab550841dcaac4377902be: Status 404 returned error can't find the container with id 4aca9988ef42e4ff0357c1d865235332b815797e8bab550841dcaac4377902be Apr 23 01:09:55.354271 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.354245 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb83ed83c_9c45_410a_8bf4_06d1bf0a4bbb.slice/crio-55715120dfd0ab3ab29a044043ed5ee65d11bb81c3f1bb50f6361c5207ce960a WatchSource:0}: Error finding container 55715120dfd0ab3ab29a044043ed5ee65d11bb81c3f1bb50f6361c5207ce960a: Status 404 returned error can't find the container with id 55715120dfd0ab3ab29a044043ed5ee65d11bb81c3f1bb50f6361c5207ce960a Apr 23 01:09:55.354482 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.354457 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52115d8e_033b_485d_aa67_434e7ae395d5.slice/crio-b3a20c576c774304132576deeaa89769ece5a1d8c2c0840225718952924f8f12 WatchSource:0}: Error finding container b3a20c576c774304132576deeaa89769ece5a1d8c2c0840225718952924f8f12: Status 404 returned error can't find the container with id b3a20c576c774304132576deeaa89769ece5a1d8c2c0840225718952924f8f12 Apr 23 01:09:55.376335 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.376297 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643f99c6_2212_475f_8fa2_d1ea3fa8a17a.slice/crio-252c4076cc54ada805ad43d5ece73f9d7d97475e02e489f2cfe33e83f9ed366f WatchSource:0}: Error finding container 252c4076cc54ada805ad43d5ece73f9d7d97475e02e489f2cfe33e83f9ed366f: Status 404 returned error can't find the container with id 252c4076cc54ada805ad43d5ece73f9d7d97475e02e489f2cfe33e83f9ed366f Apr 23 01:09:55.376952 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.376930 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod577e99e2_2b77_4e82_9822_3e4be2d6ba4d.slice/crio-45b3868c062aa400e311f6cbb765b2450fa42549adba0e0ad7b2394cb59e1718 WatchSource:0}: Error finding container 45b3868c062aa400e311f6cbb765b2450fa42549adba0e0ad7b2394cb59e1718: Status 404 returned error can't find the container with id 45b3868c062aa400e311f6cbb765b2450fa42549adba0e0ad7b2394cb59e1718 Apr 23 01:09:55.377835 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.377815 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc489e14c_dbdb_4dca_bbd0_bc90d6cdcd47.slice/crio-c1eb33189fbec7496c9123af0a9bd94170aeb28178384efb85a474b0a1a41084 WatchSource:0}: Error finding container c1eb33189fbec7496c9123af0a9bd94170aeb28178384efb85a474b0a1a41084: Status 404 returned error can't find the container with id c1eb33189fbec7496c9123af0a9bd94170aeb28178384efb85a474b0a1a41084 Apr 23 01:09:55.378477 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.378452 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b4a1fd_e436_4824_abb9_d40c296dc036.slice/crio-eca030f5091aa471b6bbbb975e2f9fc21c60f0ae33c48642666cc90e0e330fa6 WatchSource:0}: Error finding container eca030f5091aa471b6bbbb975e2f9fc21c60f0ae33c48642666cc90e0e330fa6: Status 404 returned error can't find the container with id eca030f5091aa471b6bbbb975e2f9fc21c60f0ae33c48642666cc90e0e330fa6 Apr 23 01:09:55.379375 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:09:55.379353 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb025c029_af84_46be_a329_3c26d61f764a.slice/crio-5ad924963b120fffc325a8faaff8dc61ea859a73c51001926468dfa331529be5 WatchSource:0}: Error finding container 5ad924963b120fffc325a8faaff8dc61ea859a73c51001926468dfa331529be5: Status 404 returned error can't find the container with id 5ad924963b120fffc325a8faaff8dc61ea859a73c51001926468dfa331529be5 Apr 23 01:09:55.646721 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.646416 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 01:04:53 +0000 UTC" deadline="2027-10-29 07:45:22.363447065 +0000 UTC" Apr 23 01:09:55.646721 ip-10-0-138-235 kubenswrapper[2569]: 
I0423 01:09:55.646646 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13302h35m26.71680582s" Apr 23 01:09:55.770121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.770074 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c5z7x" event={"ID":"577e99e2-2b77-4e82-9822-3e4be2d6ba4d","Type":"ContainerStarted","Data":"45b3868c062aa400e311f6cbb765b2450fa42549adba0e0ad7b2394cb59e1718"} Apr 23 01:09:55.772800 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.772747 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"b3a20c576c774304132576deeaa89769ece5a1d8c2c0840225718952924f8f12"} Apr 23 01:09:55.776192 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.776157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerStarted","Data":"55715120dfd0ab3ab29a044043ed5ee65d11bb81c3f1bb50f6361c5207ce960a"} Apr 23 01:09:55.780954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.780915 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c6dx7" event={"ID":"21b4a1fd-e436-4824-abb9-d40c296dc036","Type":"ContainerStarted","Data":"eca030f5091aa471b6bbbb975e2f9fc21c60f0ae33c48642666cc90e0e330fa6"} Apr 23 01:09:55.790154 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.790084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6wfbb" event={"ID":"643f99c6-2212-475f-8fa2-d1ea3fa8a17a","Type":"ContainerStarted","Data":"252c4076cc54ada805ad43d5ece73f9d7d97475e02e489f2cfe33e83f9ed366f"} Apr 23 01:09:55.794839 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.794759 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" event={"ID":"ced0a103-b7b9-41d1-9355-6f1429a2a4a6","Type":"ContainerStarted","Data":"4aca9988ef42e4ff0357c1d865235332b815797e8bab550841dcaac4377902be"} Apr 23 01:09:55.798822 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.798765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grrph" event={"ID":"83355238-6978-4f5d-8b07-0ea3d3784353","Type":"ContainerStarted","Data":"b4809b2f244deef576b642491d253d98bcc777a554ae6683f7e36367ee252a37"} Apr 23 01:09:55.802676 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.802157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal" event={"ID":"b29f92d08abcb80be4466bff586fc859","Type":"ContainerStarted","Data":"9e2174f20ef2904e2a21d7a5b8a5cd2393d27f1205eb3b88f989ca0f5137dc03"} Apr 23 01:09:55.804438 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.804412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fgvhm" event={"ID":"b025c029-af84-46be-a329-3c26d61f764a","Type":"ContainerStarted","Data":"5ad924963b120fffc325a8faaff8dc61ea859a73c51001926468dfa331529be5"} Apr 23 01:09:55.809050 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:55.808757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" 
event={"ID":"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47","Type":"ContainerStarted","Data":"c1eb33189fbec7496c9123af0a9bd94170aeb28178384efb85a474b0a1a41084"} Apr 23 01:09:56.238899 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:56.238065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:09:56.238899 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.238268 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:09:56.238899 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.238290 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:09:56.238899 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.238302 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xdvsx for pod openshift-network-diagnostics/network-check-target-jbfxg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:56.238899 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.238363 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx podName:943b6178-2514-4112-956c-8705385d8a3d nodeName:}" failed. No retries permitted until 2026-04-23 01:09:58.238345326 +0000 UTC m=+6.024019678 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xdvsx" (UniqueName: "kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx") pod "network-check-target-jbfxg" (UID: "943b6178-2514-4112-956c-8705385d8a3d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:56.340139 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:56.339507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:56.340139 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.339670 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:56.340139 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.339746 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:09:58.339726765 +0000 UTC m=+6.125401122 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:56.764452 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:56.764381 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:09:56.764895 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.764501 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d" Apr 23 01:09:56.764895 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:56.764847 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:56.764999 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:56.764968 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501" Apr 23 01:09:56.833435 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:56.833403 2569 generic.go:358] "Generic (PLEG): container finished" podID="c1be7c8c689749547130833b567b2938" containerID="61174115daa2429d5a4d29f4ad012b6fc7460cfbbdfc500fd3dcbd70a300f67a" exitCode=0 Apr 23 01:09:56.834356 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:56.834243 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal" event={"ID":"c1be7c8c689749547130833b567b2938","Type":"ContainerDied","Data":"61174115daa2429d5a4d29f4ad012b6fc7460cfbbdfc500fd3dcbd70a300f67a"} Apr 23 01:09:56.847148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:56.846890 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-235.ec2.internal" podStartSLOduration=3.846871781 podStartE2EDuration="3.846871781s" podCreationTimestamp="2026-04-23 01:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:09:55.815038744 +0000 UTC m=+3.600713101" watchObservedRunningTime="2026-04-23 01:09:56.846871781 +0000 UTC m=+4.632546138" Apr 23 01:09:57.842362 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:57.842321 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal" event={"ID":"c1be7c8c689749547130833b567b2938","Type":"ContainerStarted","Data":"dfc80096e79f1519416a3e878f1debe859410a41eae0f66d584c2dcc3db9fa7f"} Apr 23 01:09:58.254961 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:58.254927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: 
\"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:09:58.255125 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.255077 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:09:58.255125 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.255097 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:09:58.255125 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.255109 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xdvsx for pod openshift-network-diagnostics/network-check-target-jbfxg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:58.255285 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.255163 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx podName:943b6178-2514-4112-956c-8705385d8a3d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:02.255145009 +0000 UTC m=+10.040819346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xdvsx" (UniqueName: "kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx") pod "network-check-target-jbfxg" (UID: "943b6178-2514-4112-956c-8705385d8a3d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:09:58.355676 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:58.355634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:58.355848 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.355757 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:58.355848 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.355823 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:02.355804709 +0000 UTC m=+10.141479044 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:09:58.764274 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:58.763759 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:09:58.764274 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.763891 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501" Apr 23 01:09:58.764770 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:09:58.764591 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:09:58.764770 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:09:58.764714 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d" Apr 23 01:10:00.762925 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:00.762892 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:00.763360 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:00.763009 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d" Apr 23 01:10:00.763360 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:00.763041 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:10:00.763360 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:00.763109 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501" Apr 23 01:10:01.547957 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.547882 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-235.ec2.internal" podStartSLOduration=8.547860436 podStartE2EDuration="8.547860436s" podCreationTimestamp="2026-04-23 01:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:09:57.85451813 +0000 UTC m=+5.640192489" watchObservedRunningTime="2026-04-23 01:10:01.547860436 +0000 UTC m=+9.333534795" Apr 23 01:10:01.548181 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.548160 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6jtxr"] Apr 23 01:10:01.551328 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.551303 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.551439 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:01.551389 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4" Apr 23 01:10:01.583920 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.583891 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-kubelet-config\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.584073 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.583941 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.584073 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.583973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-dbus\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.684970 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.684932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-dbus\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.685133 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.685039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-kubelet-config\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.685133 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.685076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.685256 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:01.685203 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:01.685311 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:01.685262 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret podName:ae1a6350-32fe-4569-a4d1-9c369aaff8e4 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:02.185245631 +0000 UTC m=+9.970919971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret") pod "global-pull-secret-syncer-6jtxr" (UID: "ae1a6350-32fe-4569-a4d1-9c369aaff8e4") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:01.685383 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.685359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-dbus\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:01.685452 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:01.685426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-kubelet-config\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:02.189892 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:02.189842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:02.190355 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.190084 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:02.190355 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.190152 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret podName:ae1a6350-32fe-4569-a4d1-9c369aaff8e4 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:03.190134194 +0000 UTC m=+10.975808535 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret") pod "global-pull-secret-syncer-6jtxr" (UID: "ae1a6350-32fe-4569-a4d1-9c369aaff8e4") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:02.290918 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:02.290311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:02.290918 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.290467 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:02.290918 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.290487 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:02.290918 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.290499 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xdvsx for pod openshift-network-diagnostics/network-check-target-jbfxg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:02.290918 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.290556 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx podName:943b6178-2514-4112-956c-8705385d8a3d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:10.290536325 +0000 UTC m=+18.076210684 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xdvsx" (UniqueName: "kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx") pod "network-check-target-jbfxg" (UID: "943b6178-2514-4112-956c-8705385d8a3d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:02.391172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:02.391132 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:10:02.391358 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.391298 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:02.391414 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.391360 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:10.391342464 +0000 UTC m=+18.177016811 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:02.764328 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:02.763582 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:10:02.764328 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.763720 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501" Apr 23 01:10:02.764328 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:02.764104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:02.764328 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.764196 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4" Apr 23 01:10:02.764328 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:02.764238 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:02.764328 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:02.764299 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d" Apr 23 01:10:03.198594 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:03.198557 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:03.199052 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:03.198744 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:03.199052 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:03.198843 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret podName:ae1a6350-32fe-4569-a4d1-9c369aaff8e4 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:05.198823446 +0000 UTC m=+12.984497795 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret") pod "global-pull-secret-syncer-6jtxr" (UID: "ae1a6350-32fe-4569-a4d1-9c369aaff8e4") : object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:04.762299 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:04.762264 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:10:04.762750 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:04.762265 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:04.762750 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:04.762397 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501" Apr 23 01:10:04.762750 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:04.762466 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4" Apr 23 01:10:04.762750 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:04.762265 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:04.762750 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:04.762540 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d" Apr 23 01:10:05.212351 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:05.212277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:05.212493 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:05.212431 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 01:10:05.212493 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:05.212488 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret podName:ae1a6350-32fe-4569-a4d1-9c369aaff8e4 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:09.212473642 +0000 UTC m=+16.998147977 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret") pod "global-pull-secret-syncer-6jtxr" (UID: "ae1a6350-32fe-4569-a4d1-9c369aaff8e4") : object "kube-system"/"original-pull-secret" not registered
Apr 23 01:10:06.762409 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:06.762373 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:06.762866 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:06.762373 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:06.762866 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:06.762496 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:06.762866 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:06.762386 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:06.762866 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:06.762590 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:06.762866 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:06.762691 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:08.761914 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:08.761884 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:08.762400 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:08.761884 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:08.762400 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:08.762013 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:08.762400 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:08.761884 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:08.762400 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:08.762096 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:08.762400 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:08.762196 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:09.241876 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:09.241796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:09.242028 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:09.241912 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 01:10:09.242028 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:09.241982 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret podName:ae1a6350-32fe-4569-a4d1-9c369aaff8e4 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:17.241965411 +0000 UTC m=+25.027639745 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret") pod "global-pull-secret-syncer-6jtxr" (UID: "ae1a6350-32fe-4569-a4d1-9c369aaff8e4") : object "kube-system"/"original-pull-secret" not registered
Apr 23 01:10:10.351140 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:10.351108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:10.351659 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.351255 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 01:10:10.351659 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.351272 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 01:10:10.351659 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.351284 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xdvsx for pod openshift-network-diagnostics/network-check-target-jbfxg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:10.351659 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.351342 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx podName:943b6178-2514-4112-956c-8705385d8a3d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:26.351324921 +0000 UTC m=+34.136999271 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xdvsx" (UniqueName: "kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx") pod "network-check-target-jbfxg" (UID: "943b6178-2514-4112-956c-8705385d8a3d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 01:10:10.451898 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:10.451863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:10.452079 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.451982 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:10.452079 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.452046 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:26.452026767 +0000 UTC m=+34.237701108 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 01:10:10.761889 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:10.761796 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:10.761889 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:10.761808 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:10.762087 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.761911 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:10.762087 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:10.761808 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:10.762087 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.761989 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:10.762237 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:10.762081 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:12.762590 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.762291 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:12.763336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.762360 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:12.763336 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.762383 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:12.763336 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:12.762702 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:12.763336 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:12.762762 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:12.763336 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:12.762837 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:12.867339 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.867303 2569 generic.go:358] "Generic (PLEG): container finished" podID="b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb" containerID="e502d1502672acf2833bbb09934002b280d565d5dd2261c9c0d62f340c46a899" exitCode=0
Apr 23 01:10:12.867530 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.867393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerDied","Data":"e502d1502672acf2833bbb09934002b280d565d5dd2261c9c0d62f340c46a899"}
Apr 23 01:10:12.868856 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.868830 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c6dx7" event={"ID":"21b4a1fd-e436-4824-abb9-d40c296dc036","Type":"ContainerStarted","Data":"ea08b9aec2e1a98caa35e156f82218179363891de98ba90bf93618e117253aea"}
Apr 23 01:10:12.870418 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.870392 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6wfbb" event={"ID":"643f99c6-2212-475f-8fa2-d1ea3fa8a17a","Type":"ContainerStarted","Data":"2a0b25743cac81d204950a108cfe401fd675fb48cd9a357812a9175061b79847"}
Apr 23 01:10:12.871962 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.871934 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" event={"ID":"ced0a103-b7b9-41d1-9355-6f1429a2a4a6","Type":"ContainerStarted","Data":"84d1266b79749ce076ef2548bcddf6ec1b2c64a243c2828ef7774719859095b9"}
Apr 23 01:10:12.873661 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.873639 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-grrph" event={"ID":"83355238-6978-4f5d-8b07-0ea3d3784353","Type":"ContainerStarted","Data":"52e90a4c171d035538950f36cc3ad4d36ebb5e0f7f825a300f100d877760d504"}
Apr 23 01:10:12.874937 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.874918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fgvhm" event={"ID":"b025c029-af84-46be-a329-3c26d61f764a","Type":"ContainerStarted","Data":"d31da228907be4441c4a6b5965cbe2f079257462d1a43c3cf00c16aeb5d2b845"}
Apr 23 01:10:12.876327 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.876309 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" event={"ID":"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47","Type":"ContainerStarted","Data":"c44146b5e8a61b33f10310a638c3f32a847f244c22eba6b3493122b9e18c99e3"}
Apr 23 01:10:12.878926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.878900 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:10:12.879280 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.879259 2569 generic.go:358] "Generic (PLEG): container finished" podID="52115d8e-033b-485d-aa67-434e7ae395d5" containerID="5b3ea3c1c826feaa10df41ff5bb2837fbbf258c52c610faf0039452ebfd0e02c" exitCode=1
Apr 23 01:10:12.879393 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.879281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"b8147b53032ad372076a8c00216b92b2c95745f752e64e2ede17f32a7c2b3d43"}
Apr 23 01:10:12.879393 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.879316 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"24f9ebe7531ebc1222852b9560f5479506ce4bfe58bd35ccfe8a3bfe923068d2"}
Apr 23 01:10:12.879393 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.879330 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"91d031aff5ddbf526927a19d843cd1a01fe40c8b4351fe5b2b28abd21b5e706d"}
Apr 23 01:10:12.879393 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.879342 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"bd47c69c0b1e0872555a7b00a64851950cb29a135007febae9063307f3489d75"}
Apr 23 01:10:12.879393 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.879354 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerDied","Data":"5b3ea3c1c826feaa10df41ff5bb2837fbbf258c52c610faf0039452ebfd0e02c"}
Apr 23 01:10:12.879393 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.879368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"601ff9efbfb8e5ce35c9a2246e5ce1552cdaf85c8c756347e5cb602f2ffa08e3"}
Apr 23 01:10:12.904839 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.904792 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fgvhm" podStartSLOduration=4.181317391 podStartE2EDuration="20.904777179s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.381906964 +0000 UTC m=+3.167581313" lastFinishedPulling="2026-04-23 01:10:12.105366762 +0000 UTC m=+19.891041101" observedRunningTime="2026-04-23 01:10:12.904443912 +0000 UTC m=+20.690118268" watchObservedRunningTime="2026-04-23 01:10:12.904777179 +0000 UTC m=+20.690451537"
Apr 23 01:10:12.905176 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.905145 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s6dcr" podStartSLOduration=4.085205774 podStartE2EDuration="20.905141024s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.354455758 +0000 UTC m=+3.140130100" lastFinishedPulling="2026-04-23 01:10:12.17439101 +0000 UTC m=+19.960065350" observedRunningTime="2026-04-23 01:10:12.894201461 +0000 UTC m=+20.679875819" watchObservedRunningTime="2026-04-23 01:10:12.905141024 +0000 UTC m=+20.690815383"
Apr 23 01:10:12.937774 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.937739 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-grrph" podStartSLOduration=12.035524738 podStartE2EDuration="20.937728366s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.352976327 +0000 UTC m=+3.138650676" lastFinishedPulling="2026-04-23 01:10:04.255179969 +0000 UTC m=+12.040854304" observedRunningTime="2026-04-23 01:10:12.916637313 +0000 UTC m=+20.702311671" watchObservedRunningTime="2026-04-23 01:10:12.937728366 +0000 UTC m=+20.723402729"
Apr 23 01:10:12.956378 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.956339 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c6dx7" podStartSLOduration=3.922502907 podStartE2EDuration="20.956327743s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.381226448 +0000 UTC m=+3.166900783" lastFinishedPulling="2026-04-23 01:10:12.415051268 +0000 UTC m=+20.200725619" observedRunningTime="2026-04-23 01:10:12.93751524 +0000 UTC m=+20.723189597" watchObservedRunningTime="2026-04-23 01:10:12.956327743 +0000 UTC m=+20.742002099"
Apr 23 01:10:12.956510 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:12.956483 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6wfbb" podStartSLOduration=3.232751137 podStartE2EDuration="19.956478679s" podCreationTimestamp="2026-04-23 01:09:53 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.381635163 +0000 UTC m=+3.167309509" lastFinishedPulling="2026-04-23 01:10:12.1053627 +0000 UTC m=+19.891037051" observedRunningTime="2026-04-23 01:10:12.955977776 +0000 UTC m=+20.741652132" watchObservedRunningTime="2026-04-23 01:10:12.956478679 +0000 UTC m=+20.742153036"
Apr 23 01:10:13.286558 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:13.286538 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 01:10:13.693527 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:13.693421 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T01:10:13.286555125Z","UUID":"1c1d03bc-1e05-4994-98ae-9695eac94260","Handler":null,"Name":"","Endpoint":""}
Apr 23 01:10:13.695098 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:13.695072 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 01:10:13.695098 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:13.695098 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 01:10:13.882751 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:13.882715 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" event={"ID":"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47","Type":"ContainerStarted","Data":"b9d75c5d8d2a6376ccce0374d0c1559c01f5ff86f481499e9c3444095fe450b3"}
Apr 23 01:10:13.884186 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:13.884155 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c5z7x" event={"ID":"577e99e2-2b77-4e82-9822-3e4be2d6ba4d","Type":"ContainerStarted","Data":"b1e19effc5268035b5369e6fe160ca8b5dbaaab42bb1f368628f017832518497"}
Apr 23 01:10:13.895710 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:13.895634 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c5z7x" podStartSLOduration=4.136911294 podStartE2EDuration="20.895606061s" podCreationTimestamp="2026-04-23 01:09:53 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.380282643 +0000 UTC m=+3.165956995" lastFinishedPulling="2026-04-23 01:10:12.138977415 +0000 UTC m=+19.924651762" observedRunningTime="2026-04-23 01:10:13.895505438 +0000 UTC m=+21.681179797" watchObservedRunningTime="2026-04-23 01:10:13.895606061 +0000 UTC m=+21.681280433"
Apr 23 01:10:14.764995 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:14.764817 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:14.765103 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:14.764822 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:14.765103 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:14.765067 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:14.765213 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:14.765148 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:14.765213 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:14.764820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:14.765296 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:14.765225 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:14.888847 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:14.888819 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:10:14.889254 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:14.889225 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"0338d4250e8aa8db957c711d0a6fe1a36ad4de5a373baad088d9286da56e176c"}
Apr 23 01:10:14.891532 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:14.891432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" event={"ID":"c489e14c-dbdb-4dca-bbd0-bc90d6cdcd47","Type":"ContainerStarted","Data":"c2de64bdd0df1fa9f8f9004de68d2e2185ea59a88efbca86ded9ff9ee3e524ed"}
Apr 23 01:10:14.919403 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:14.919364 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zhnh" podStartSLOduration=4.139770792 podStartE2EDuration="22.919352015s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.380206765 +0000 UTC m=+3.165881115" lastFinishedPulling="2026-04-23 01:10:14.159787999 +0000 UTC m=+21.945462338" observedRunningTime="2026-04-23 01:10:14.919168893 +0000 UTC m=+22.704843249" watchObservedRunningTime="2026-04-23 01:10:14.919352015 +0000 UTC m=+22.705026375"
Apr 23 01:10:16.280072 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:16.280039 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6wfbb"
Apr 23 01:10:16.280749 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:16.280728 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6wfbb"
Apr 23 01:10:16.625376 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:16.625298 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6wfbb"
Apr 23 01:10:16.625926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:16.625897 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6wfbb"
Apr 23 01:10:16.762869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:16.762830 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:16.762869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:16.762856 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:16.763103 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:16.762926 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:16.763103 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:16.762941 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:16.763103 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:16.763027 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:16.763261 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:16.763101 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:17.303072 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:17.303037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:17.303533 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:17.303198 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 01:10:17.303533 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:17.303267 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret podName:ae1a6350-32fe-4569-a4d1-9c369aaff8e4 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:33.303251579 +0000 UTC m=+41.088925921 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret") pod "global-pull-secret-syncer-6jtxr" (UID: "ae1a6350-32fe-4569-a4d1-9c369aaff8e4") : object "kube-system"/"original-pull-secret" not registered
Apr 23 01:10:17.900778 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:17.900591 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:10:17.901186 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:17.901157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"35bd5627d836aa0b4693f844dd7e447e6b2760c2847816cbf01a6a64e1b4351e"}
Apr 23 01:10:17.901455 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:17.901433 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:10:17.901649 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:17.901631 2569 scope.go:117] "RemoveContainer" containerID="5b3ea3c1c826feaa10df41ff5bb2837fbbf258c52c610faf0039452ebfd0e02c"
Apr 23 01:10:17.902898 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:17.902866 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerStarted","Data":"2752e079889956489306572cfa13cdd8691d9ce30ebc8b799bb222f3fcf05051"}
Apr 23 01:10:17.916253 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:17.916232 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:10:18.765491 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.765398 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:18.765957 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.765518 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:18.765957 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:18.765535 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:18.765957 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:18.765654 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:18.765957 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.765692 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:18.765957 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:18.765755 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:18.907824 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.907796 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:10:18.908167 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.908139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" event={"ID":"52115d8e-033b-485d-aa67-434e7ae395d5","Type":"ContainerStarted","Data":"34963d2c92fe5f7cd9666404f171b5388d54fb5e5d9ebd5e8fcd0f5e3df0ce64"}
Apr 23 01:10:18.908430 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.908407 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:10:18.908567 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.908436 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:10:18.909863 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.909840 2569 generic.go:358] "Generic (PLEG): container finished" podID="b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb" containerID="2752e079889956489306572cfa13cdd8691d9ce30ebc8b799bb222f3fcf05051" exitCode=0
Apr 23 01:10:18.909960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.909885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerDied","Data":"2752e079889956489306572cfa13cdd8691d9ce30ebc8b799bb222f3fcf05051"}
Apr 23 01:10:18.923018 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.922999 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79v47"
Apr 23 01:10:18.933116 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:18.933078 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" podStartSLOduration=10.072488032 podStartE2EDuration="26.933065553s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.374466815 +0000 UTC m=+3.160141161" lastFinishedPulling="2026-04-23 01:10:12.23504433 +0000 UTC m=+20.020718682" observedRunningTime="2026-04-23 01:10:18.932211163 +0000 UTC m=+26.717885520" watchObservedRunningTime="2026-04-23 01:10:18.933065553 +0000 UTC m=+26.718739906"
Apr 23 01:10:19.490581 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.490544 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5mm4v"]
Apr 23 01:10:19.490778 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.490707 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:19.490879 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:19.490828 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:19.493173 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.493141 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jbfxg"]
Apr 23 01:10:19.493304 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.493254 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:19.493366 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:19.493338 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:19.493978 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.493952 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6jtxr"]
Apr 23 01:10:19.494075 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.494059 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:19.494179 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:19.494154 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:19.913721 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.913688 2569 generic.go:358] "Generic (PLEG): container finished" podID="b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb" containerID="ad68aa2f1b922c5589730067a02331586e26681f4908f38920e74bcdfbf20ab5" exitCode=0
Apr 23 01:10:19.914061 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:19.913770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerDied","Data":"ad68aa2f1b922c5589730067a02331586e26681f4908f38920e74bcdfbf20ab5"}
Apr 23 01:10:20.761969 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:20.761941 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:20.761969 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:20.761969 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:20.762137 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:20.762066 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:20.762201 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:20.762182 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:20.917699 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:20.917498 2569 generic.go:358] "Generic (PLEG): container finished" podID="b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb" containerID="1fee7e7127b967d5fd3ea7cbaa0a70142a72248d62672bc421f349674218122d" exitCode=0
Apr 23 01:10:20.918079 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:20.917581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerDied","Data":"1fee7e7127b967d5fd3ea7cbaa0a70142a72248d62672bc421f349674218122d"}
Apr 23 01:10:21.762482 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:21.762404 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:21.762683 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:21.762520 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:22.763312 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:22.763284 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:22.763979 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:22.763377 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:22.763979 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:22.763402 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:22.763979 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:22.763441 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:23.762135 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:23.762108 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr"
Apr 23 01:10:23.762300 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:23.762204 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6jtxr" podUID="ae1a6350-32fe-4569-a4d1-9c369aaff8e4"
Apr 23 01:10:24.762134 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:24.762049 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:10:24.762134 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:24.762086 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:10:24.762811 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:24.762177 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jbfxg" podUID="943b6178-2514-4112-956c-8705385d8a3d"
Apr 23 01:10:24.762811 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:24.762635 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501"
Apr 23 01:10:25.074652 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.074572 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-235.ec2.internal" event="NodeReady"
Apr 23 01:10:25.074810 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.074741 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 01:10:25.105365 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.105336 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-65b68c6657-v6kp5"]
Apr 23 01:10:25.139573 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.139544 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rpd44"]
Apr 23 01:10:25.139736 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.139718 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.143527 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.142367 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 01:10:25.143527 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.142438 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 01:10:25.143527 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.142507 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ndfmc\""
Apr 23 01:10:25.143527 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.142558 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 01:10:25.147758 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.147736 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 01:10:25.157766 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.157743 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65b68c6657-v6kp5"]
Apr 23 01:10:25.157867 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.157771 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rpd44"]
Apr 23 01:10:25.157917 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.157888 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.160130 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.159913 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 01:10:25.160130 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.159929 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 01:10:25.160130 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.159965 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4q4b\""
Apr 23 01:10:25.216387 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.216360 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bk66m"]
Apr 23 01:10:25.237826 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.237798 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bk66m"]
Apr 23 01:10:25.237969 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.237948 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bk66m"
Apr 23 01:10:25.240171 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.240149 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 01:10:25.240171 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.240155 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 01:10:25.240370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.240242 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jjv8h\""
Apr 23 01:10:25.240370 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.240157 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 01:10:25.257658 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257631 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-image-registry-private-configuration\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.257783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nns\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-kube-api-access-v8nns\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.257783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257709 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-tmp-dir\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.257783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-installation-pull-secrets\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.257783 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257748 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-certificates\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.257993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257805 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1679ef2e-e9c0-4738-a19c-35582013fe18-ca-trust-extracted\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.257993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-config-volume\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.257993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257878 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drjbr\" (UniqueName: \"kubernetes.io/projected/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-kube-api-access-drjbr\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.257993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-bound-sa-token\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.257993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-trusted-ca\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.257993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.257993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.257972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359025 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.358939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-certificates\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359025 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.358994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1679ef2e-e9c0-4738-a19c-35582013fe18-ca-trust-extracted\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359025 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-config-volume\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.359317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drjbr\" (UniqueName: \"kubernetes.io/projected/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-kube-api-access-drjbr\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.359317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-bound-sa-token\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359113 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-trusted-ca\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359156 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtt9\" (UniqueName: \"kubernetes.io/projected/79d4f338-f964-4fa6-985e-50bbb3b105a9-kube-api-access-8xtt9\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m"
Apr 23 01:10:25.359317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.359317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359298 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m"
Apr 23 01:10:25.359665 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359343 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-image-registry-private-configuration\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359665 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8nns\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-kube-api-access-v8nns\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359665 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359412 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-tmp-dir\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:10:25.359665 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359455 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-installation-pull-secrets\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:10:25.359665 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.359537 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:10:25.359665 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.359668 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:25.859642376 +0000 UTC m=+33.645316727 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found Apr 23 01:10:25.359943 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359821 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-certificates\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.359943 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-tmp-dir\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:25.359943 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.359902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-config-volume\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:25.360096 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.359949 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 01:10:25.360096 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.359965 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found Apr 23 01:10:25.360096 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.360019 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:25.860004161 +0000 UTC m=+33.645678496 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found Apr 23 01:10:25.360239 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.360123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1679ef2e-e9c0-4738-a19c-35582013fe18-ca-trust-extracted\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.360660 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.360640 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-trusted-ca\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.364015 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.363995 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-image-registry-private-configuration\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.364015 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.364006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-installation-pull-secrets\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.370778 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.370730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drjbr\" (UniqueName: \"kubernetes.io/projected/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-kube-api-access-drjbr\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:25.370936 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.370914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8nns\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-kube-api-access-v8nns\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.371121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.371067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-bound-sa-token\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.460417 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.460377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" 
(UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:25.460600 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.460568 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:25.460681 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.460660 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:25.960640413 +0000 UTC m=+33.746314772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found Apr 23 01:10:25.460746 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.460703 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtt9\" (UniqueName: \"kubernetes.io/projected/79d4f338-f964-4fa6-985e-50bbb3b105a9-kube-api-access-8xtt9\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:25.468822 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.468803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtt9\" (UniqueName: \"kubernetes.io/projected/79d4f338-f964-4fa6-985e-50bbb3b105a9-kube-api-access-8xtt9\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:25.762656 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.762598 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:25.764946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.764919 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 01:10:25.864808 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.864770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:25.864987 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.864818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:25.864987 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.864941 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 01:10:25.864987 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.864943 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:25.865142 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.865027 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:26.865008734 +0000 UTC m=+34.650683086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found Apr 23 01:10:25.865142 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.864956 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found Apr 23 01:10:25.865142 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.865085 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:26.865075339 +0000 UTC m=+34.650749697 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found Apr 23 01:10:25.965448 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:25.965411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:25.965652 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.965548 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:25.965652 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:25.965632 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:26.965599496 +0000 UTC m=+34.751273847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found Apr 23 01:10:26.369863 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.369824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:26.370084 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.369993 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 01:10:26.370084 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.370016 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 01:10:26.370084 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.370026 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xdvsx for pod openshift-network-diagnostics/network-check-target-jbfxg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:26.370084 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.370078 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx podName:943b6178-2514-4112-956c-8705385d8a3d nodeName:}" failed. No retries permitted until 2026-04-23 01:10:58.370063035 +0000 UTC m=+66.155737389 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xdvsx" (UniqueName: "kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx") pod "network-check-target-jbfxg" (UID: "943b6178-2514-4112-956c-8705385d8a3d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 01:10:26.471001 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.470962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:10:26.471181 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.471109 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:26.471181 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.471172 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:58.471157162 +0000 UTC m=+66.256831517 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 01:10:26.762089 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.762055 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:10:26.762321 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.762055 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:26.766754 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.766724 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 01:10:26.766754 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.766753 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 01:10:26.767225 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.766728 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj44d\"" Apr 23 01:10:26.767283 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.767240 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hjv94\"" Apr 23 01:10:26.767371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.767352 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 01:10:26.874490 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.874457 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:26.874490 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.874495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:26.874738 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.874581 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 01:10:26.874738 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.874593 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found Apr 23 01:10:26.874738 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.874582 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:26.874738 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.874667 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:28.874652885 +0000 UTC m=+36.660327219 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found Apr 23 01:10:26.874738 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.874726 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:28.874706301 +0000 UTC m=+36.660380652 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found Apr 23 01:10:26.975039 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:26.975004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:26.975187 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.975175 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:26.975245 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:26.975236 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:28.975217143 +0000 UTC m=+36.760891495 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found Apr 23 01:10:27.931324 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:27.931293 2569 generic.go:358] "Generic (PLEG): container finished" podID="b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb" containerID="4a33099f00bd95379551cf50a02ff06f711f787d6bf007b238a89d2225192237" exitCode=0 Apr 23 01:10:27.931739 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:27.931340 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerDied","Data":"4a33099f00bd95379551cf50a02ff06f711f787d6bf007b238a89d2225192237"} Apr 23 01:10:28.889918 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:28.889730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:28.889918 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:28.889923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:28.890107 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:28.889880 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:28.890107 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:28.890014 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 01:10:28.890107 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:28.890024 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found Apr 23 01:10:28.890107 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:28.890044 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:32.890027416 +0000 UTC m=+40.675701774 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found Apr 23 01:10:28.890107 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:28.890062 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:32.890051087 +0000 UTC m=+40.675725423 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found Apr 23 01:10:28.935430 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:28.935399 2569 generic.go:358] "Generic (PLEG): container finished" podID="b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb" containerID="5c6eadec53d61eee61c3dd2207a04ce2904033ed2c00ff2a28767024fcecf91e" exitCode=0 Apr 23 01:10:28.935815 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:28.935445 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerDied","Data":"5c6eadec53d61eee61c3dd2207a04ce2904033ed2c00ff2a28767024fcecf91e"} Apr 23 01:10:28.990840 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:28.990806 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:28.990984 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:28.990965 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:28.991060 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:28.991041 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:32.991019743 +0000 UTC m=+40.776694098 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found Apr 23 01:10:29.940435 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:29.940389 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" event={"ID":"b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb","Type":"ContainerStarted","Data":"5c4d0d1f84ded5c9733236374f381356b86463b194ff782c76a1250d36f023fc"} Apr 23 01:10:29.960187 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:29.960141 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gzwpj" podStartSLOduration=6.421414866 podStartE2EDuration="37.960128397s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:09:55.374478672 +0000 UTC m=+3.160153018" lastFinishedPulling="2026-04-23 01:10:26.913192197 +0000 UTC m=+34.698866549" observedRunningTime="2026-04-23 01:10:29.959129037 +0000 UTC m=+37.744803394" watchObservedRunningTime="2026-04-23 01:10:29.960128397 +0000 UTC m=+37.745802754" Apr 23 01:10:32.920928 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:32.920886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:32.920928 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:32.920932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:32.921412 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:32.921028 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:32.921412 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:32.921091 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:40.921075951 +0000 UTC m=+48.706750289 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found Apr 23 01:10:32.921412 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:32.921033 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 01:10:32.921412 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:32.921111 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found Apr 23 01:10:32.921412 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:32.921135 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:40.921129216 +0000 UTC m=+48.706803550 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found Apr 23 01:10:33.021817 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:33.021778 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:33.021952 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:33.021885 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:33.021952 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:33.021946 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:41.021932332 +0000 UTC m=+48.807606667 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found Apr 23 01:10:33.324221 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:33.324140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:33.327590 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:33.327565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae1a6350-32fe-4569-a4d1-9c369aaff8e4-original-pull-secret\") pod \"global-pull-secret-syncer-6jtxr\" (UID: \"ae1a6350-32fe-4569-a4d1-9c369aaff8e4\") " pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:33.574350 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:33.574270 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6jtxr" Apr 23 01:10:33.721469 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:33.721435 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6jtxr"] Apr 23 01:10:33.727219 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:10:33.727189 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1a6350_32fe_4569_a4d1_9c369aaff8e4.slice/crio-4d814eaead8876bdbce298f1b073aace61d9657c784f666ab03f2f2c6ad54e5e WatchSource:0}: Error finding container 4d814eaead8876bdbce298f1b073aace61d9657c784f666ab03f2f2c6ad54e5e: Status 404 returned error can't find the container with id 4d814eaead8876bdbce298f1b073aace61d9657c784f666ab03f2f2c6ad54e5e Apr 23 01:10:33.948357 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:33.948321 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6jtxr" event={"ID":"ae1a6350-32fe-4569-a4d1-9c369aaff8e4","Type":"ContainerStarted","Data":"4d814eaead8876bdbce298f1b073aace61d9657c784f666ab03f2f2c6ad54e5e"} Apr 23 01:10:39.961232 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:39.961189 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6jtxr" event={"ID":"ae1a6350-32fe-4569-a4d1-9c369aaff8e4","Type":"ContainerStarted","Data":"18f8bfb1d53075c61e6de9fb2d7d6e26458a4be9c6a4a68ba81e0838fddc36a5"} Apr 23 01:10:39.975042 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:39.974997 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6jtxr" podStartSLOduration=33.369882311 podStartE2EDuration="38.974984858s" podCreationTimestamp="2026-04-23 01:10:01 +0000 UTC" firstStartedPulling="2026-04-23 01:10:33.729464163 +0000 UTC m=+41.515138501" lastFinishedPulling="2026-04-23 01:10:39.334566705 +0000 UTC m=+47.120241048" observedRunningTime="2026-04-23 01:10:39.974558175 +0000 UTC m=+47.760232533" watchObservedRunningTime="2026-04-23 01:10:39.974984858 +0000 UTC m=+47.760659236" Apr 23 01:10:40.983044 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:40.983008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:40.983477 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:40.983051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:40.983477 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:40.983149 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:40.983477 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:40.983154 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 01:10:40.983477 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:40.983209 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:56.983195165 +0000 UTC m=+64.768869501 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found Apr 23 01:10:40.983477 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:40.983212 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found Apr 23 01:10:40.983477 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:40.983257 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:10:56.983242878 +0000 UTC m=+64.768917212 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found Apr 23 01:10:41.084395 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:41.084369 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:41.084524 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:41.084495 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:41.084568 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:41.084547 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. 
No retries permitted until 2026-04-23 01:10:57.084534148 +0000 UTC m=+64.870208488 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found Apr 23 01:10:50.928499 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:50.928470 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-79v47" Apr 23 01:10:56.993367 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:56.993325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:10:56.993780 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:56.993378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:10:56.993780 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:56.993480 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 01:10:56.993780 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:56.993535 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 01:10:56.993780 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:56.993549 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found Apr 23 01:10:56.993780 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:56.993559 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:28.993535881 +0000 UTC m=+96.779210233 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found Apr 23 01:10:56.993780 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:56.993585 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:28.993574657 +0000 UTC m=+96.779249007 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found Apr 23 01:10:57.094293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:57.094254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:10:57.094421 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:57.094394 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 01:10:57.094512 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:57.094471 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:29.094455643 +0000 UTC m=+96.880129983 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found Apr 23 01:10:58.401937 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.401899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:58.404463 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.404445 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 01:10:58.414105 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.414090 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 01:10:58.425195 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.425171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdvsx\" (UniqueName: \"kubernetes.io/projected/943b6178-2514-4112-956c-8705385d8a3d-kube-api-access-xdvsx\") pod \"network-check-target-jbfxg\" (UID: \"943b6178-2514-4112-956c-8705385d8a3d\") " pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:58.502993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.502956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:10:58.505009 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.504991 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 01:10:58.513614 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:58.513593 2569 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 01:10:58.513677 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:10:58.513667 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:02.513652376 +0000 UTC m=+130.299326711 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : secret "metrics-daemon-secret" not found Apr 23 01:10:58.580995 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.580969 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hjv94\"" Apr 23 01:10:58.589706 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.589684 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:10:58.700650 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.700593 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jbfxg"] Apr 23 01:10:58.704744 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:10:58.704716 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943b6178_2514_4112_956c_8705385d8a3d.slice/crio-4b6fc092ebc8ee6a4767fa4c2834cc0d98b3db3bfe0aeac137f8885eae22caae WatchSource:0}: Error finding container 4b6fc092ebc8ee6a4767fa4c2834cc0d98b3db3bfe0aeac137f8885eae22caae: Status 404 returned error can't find the container with id 4b6fc092ebc8ee6a4767fa4c2834cc0d98b3db3bfe0aeac137f8885eae22caae Apr 23 01:10:58.998939 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:10:58.998851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jbfxg" event={"ID":"943b6178-2514-4112-956c-8705385d8a3d","Type":"ContainerStarted","Data":"4b6fc092ebc8ee6a4767fa4c2834cc0d98b3db3bfe0aeac137f8885eae22caae"} Apr 23 01:11:02.006099 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:02.006065 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jbfxg" event={"ID":"943b6178-2514-4112-956c-8705385d8a3d","Type":"ContainerStarted","Data":"e26d1edb4fc3f3c5af482c4196a2731d664d91b0a008a5245ade87c6e50c0eae"} Apr 23 01:11:02.006505 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:02.006179 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jbfxg" Apr 23 01:11:02.020693 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:02.020647 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jbfxg" podStartSLOduration=67.478466738 podStartE2EDuration="1m10.020635032s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:10:58.706544335 +0000 UTC m=+66.492218670" lastFinishedPulling="2026-04-23 01:11:01.248712608 +0000 UTC m=+69.034386964" observedRunningTime="2026-04-23 01:11:02.019884032 +0000 UTC m=+69.805558388" watchObservedRunningTime="2026-04-23 01:11:02.020635032 +0000 UTC m=+69.806309386" Apr 23 
01:11:29.029591 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:29.029526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44"
Apr 23 01:11:29.029591 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:29.029593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5"
Apr 23 01:11:29.030127 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:29.029693 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 01:11:29.030127 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:29.029751 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 01:11:29.030127 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:29.029770 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65b68c6657-v6kp5: secret "image-registry-tls" not found
Apr 23 01:11:29.030127 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:29.029777 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls podName:746bf4ef-4ba5-45d1-9cc6-ab6354c10b18 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:33.029760225 +0000 UTC m=+160.815434579 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls") pod "dns-default-rpd44" (UID: "746bf4ef-4ba5-45d1-9cc6-ab6354c10b18") : secret "dns-default-metrics-tls" not found
Apr 23 01:11:29.030127 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:29.029813 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls podName:1679ef2e-e9c0-4738-a19c-35582013fe18 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:33.029799088 +0000 UTC m=+160.815473423 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls") pod "image-registry-65b68c6657-v6kp5" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18") : secret "image-registry-tls" not found
Apr 23 01:11:29.130680 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:29.130645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m"
Apr 23 01:11:29.130847 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:29.130761 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 01:11:29.130847 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:29.130814 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert podName:79d4f338-f964-4fa6-985e-50bbb3b105a9 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:33.130799545 +0000 UTC m=+160.916473881 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert") pod "ingress-canary-bk66m" (UID: "79d4f338-f964-4fa6-985e-50bbb3b105a9") : secret "canary-serving-cert" not found
Apr 23 01:11:33.010111 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:33.010078 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jbfxg"
Apr 23 01:11:41.487494 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.487456 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"]
Apr 23 01:11:41.490300 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.490283 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sxqx7"]
Apr 23 01:11:41.490454 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.490435 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.492790 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.492765 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 01:11:41.492790 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.492792 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.492972 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.492844 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 01:11:41.493954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.493931 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 01:11:41.494063 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.493970 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 01:11:41.494134 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.494078 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-7bzlt\""
Apr 23 01:11:41.494685 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.494665 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 23 01:11:41.494780 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.494737 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 01:11:41.494842 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.494785 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-x88z7\""
Apr 23 01:11:41.494842 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.494828 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 01:11:41.495011 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.494994 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 01:11:41.501743 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.501715 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"]
Apr 23 01:11:41.502399 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.502377 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sxqx7"]
Apr 23 01:11:41.503169 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.503147 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 01:11:41.626506 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7df9359b-30f8-4806-98db-cf999c7f0ed8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.626506 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.626758 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626646 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7df9359b-30f8-4806-98db-cf999c7f0ed8-tmp\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.626758 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626683 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7df9359b-30f8-4806-98db-cf999c7f0ed8-service-ca-bundle\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.626758 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626717 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7df9359b-30f8-4806-98db-cf999c7f0ed8-snapshots\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.626758 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626738 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/287bea1a-68cd-49c1-a36a-2fc24dbc7719-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.626901 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df9359b-30f8-4806-98db-cf999c7f0ed8-serving-cert\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.626901 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg2tw\" (UniqueName: \"kubernetes.io/projected/287bea1a-68cd-49c1-a36a-2fc24dbc7719-kube-api-access-zg2tw\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.626901 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.626844 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8n6\" (UniqueName: \"kubernetes.io/projected/7df9359b-30f8-4806-98db-cf999c7f0ed8-kube-api-access-8g8n6\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.727752 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7df9359b-30f8-4806-98db-cf999c7f0ed8-snapshots\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.727873 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727785 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/287bea1a-68cd-49c1-a36a-2fc24dbc7719-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.727873 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df9359b-30f8-4806-98db-cf999c7f0ed8-serving-cert\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.727873 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727827 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zg2tw\" (UniqueName: \"kubernetes.io/projected/287bea1a-68cd-49c1-a36a-2fc24dbc7719-kube-api-access-zg2tw\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.727873 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727849 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8n6\" (UniqueName: \"kubernetes.io/projected/7df9359b-30f8-4806-98db-cf999c7f0ed8-kube-api-access-8g8n6\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.728070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7df9359b-30f8-4806-98db-cf999c7f0ed8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.728070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.728070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.727998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7df9359b-30f8-4806-98db-cf999c7f0ed8-tmp\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.728070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.728038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7df9359b-30f8-4806-98db-cf999c7f0ed8-service-ca-bundle\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.728277 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:41.728162 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:41.728277 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:41.728247 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls podName:287bea1a-68cd-49c1-a36a-2fc24dbc7719 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:42.228223915 +0000 UTC m=+110.013898265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qgtjh" (UID: "287bea1a-68cd-49c1-a36a-2fc24dbc7719") : secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:41.728493 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.728462 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7df9359b-30f8-4806-98db-cf999c7f0ed8-snapshots\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.728575 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.728488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7df9359b-30f8-4806-98db-cf999c7f0ed8-tmp\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.728697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.728674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/287bea1a-68cd-49c1-a36a-2fc24dbc7719-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.728815 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.728799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7df9359b-30f8-4806-98db-cf999c7f0ed8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.729136 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.729121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7df9359b-30f8-4806-98db-cf999c7f0ed8-service-ca-bundle\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.730105 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.730083 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df9359b-30f8-4806-98db-cf999c7f0ed8-serving-cert\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.738156 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.738099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg2tw\" (UniqueName: \"kubernetes.io/projected/287bea1a-68cd-49c1-a36a-2fc24dbc7719-kube-api-access-zg2tw\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:41.738156 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.738149 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8n6\" (UniqueName: \"kubernetes.io/projected/7df9359b-30f8-4806-98db-cf999c7f0ed8-kube-api-access-8g8n6\") pod \"insights-operator-585dfdc468-sxqx7\" (UID: \"7df9359b-30f8-4806-98db-cf999c7f0ed8\") " pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.811128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.811077 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sxqx7"
Apr 23 01:11:41.923848 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:41.923814 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sxqx7"]
Apr 23 01:11:41.926416 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:11:41.926388 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7df9359b_30f8_4806_98db_cf999c7f0ed8.slice/crio-cd9d9132b75b41b9eb102cccf44a0d151e42fff25d72a33d7cbe1cbbe2c282a0 WatchSource:0}: Error finding container cd9d9132b75b41b9eb102cccf44a0d151e42fff25d72a33d7cbe1cbbe2c282a0: Status 404 returned error can't find the container with id cd9d9132b75b41b9eb102cccf44a0d151e42fff25d72a33d7cbe1cbbe2c282a0
Apr 23 01:11:42.084316 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:42.084233 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sxqx7" event={"ID":"7df9359b-30f8-4806-98db-cf999c7f0ed8","Type":"ContainerStarted","Data":"cd9d9132b75b41b9eb102cccf44a0d151e42fff25d72a33d7cbe1cbbe2c282a0"}
Apr 23 01:11:42.232594 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:42.232564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:42.232753 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:42.232734 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:42.232834 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:42.232819 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls podName:287bea1a-68cd-49c1-a36a-2fc24dbc7719 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:43.232798758 +0000 UTC m=+111.018473107 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qgtjh" (UID: "287bea1a-68cd-49c1-a36a-2fc24dbc7719") : secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:43.240814 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:43.240774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:43.241264 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:43.240931 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:43.241264 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:43.240998 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls podName:287bea1a-68cd-49c1-a36a-2fc24dbc7719 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:45.240982838 +0000 UTC m=+113.026657173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qgtjh" (UID: "287bea1a-68cd-49c1-a36a-2fc24dbc7719") : secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:45.258262 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:45.258219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:45.258659 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:45.258383 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:45.258659 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:45.258455 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls podName:287bea1a-68cd-49c1-a36a-2fc24dbc7719 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:49.258436546 +0000 UTC m=+117.044110880 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qgtjh" (UID: "287bea1a-68cd-49c1-a36a-2fc24dbc7719") : secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:46.092555 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:46.092515 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sxqx7" event={"ID":"7df9359b-30f8-4806-98db-cf999c7f0ed8","Type":"ContainerStarted","Data":"b59396295ce0c96421f2e3a9d5ec6e9a623378d4e0ba0789648fa275c48df7d7"}
Apr 23 01:11:46.118198 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:46.118150 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-sxqx7" podStartSLOduration=1.96752371 podStartE2EDuration="5.118135925s" podCreationTimestamp="2026-04-23 01:11:41 +0000 UTC" firstStartedPulling="2026-04-23 01:11:41.928159438 +0000 UTC m=+109.713833773" lastFinishedPulling="2026-04-23 01:11:45.07877165 +0000 UTC m=+112.864445988" observedRunningTime="2026-04-23 01:11:46.117553857 +0000 UTC m=+113.903228214" watchObservedRunningTime="2026-04-23 01:11:46.118135925 +0000 UTC m=+113.903810281"
Apr 23 01:11:47.781359 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:47.781330 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fgvhm_b025c029-af84-46be-a329-3c26d61f764a/dns-node-resolver/0.log"
Apr 23 01:11:48.786379 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:48.786351 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-grrph_83355238-6978-4f5d-8b07-0ea3d3784353/node-ca/0.log"
Apr 23 01:11:49.287955 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:49.287919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:49.288135 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:49.288060 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:49.288135 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:49.288115 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls podName:287bea1a-68cd-49c1-a36a-2fc24dbc7719 nodeName:}" failed. No retries permitted until 2026-04-23 01:11:57.288101236 +0000 UTC m=+125.073775572 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qgtjh" (UID: "287bea1a-68cd-49c1-a36a-2fc24dbc7719") : secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:50.523376 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.523342 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"]
Apr 23 01:11:50.526430 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.526415 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:50.528690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.528667 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-vpsm2\""
Apr 23 01:11:50.528775 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.528677 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 01:11:50.529334 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.529313 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:11:50.529435 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.529316 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 01:11:50.535039 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.535022 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"]
Apr 23 01:11:50.600935 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.600890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6kj\" (UniqueName: \"kubernetes.io/projected/cc66612f-a771-483f-99ec-74f874ac2d4d-kube-api-access-mm6kj\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:50.600935 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.600937 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:50.624124 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.624091 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"]
Apr 23 01:11:50.626941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.626924 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"
Apr 23 01:11:50.629180 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.629159 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:11:50.629180 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.629169 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-z6gb2\""
Apr 23 01:11:50.629392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.629162 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 01:11:50.633890 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.633862 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"]
Apr 23 01:11:50.702240 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.702201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6kj\" (UniqueName: \"kubernetes.io/projected/cc66612f-a771-483f-99ec-74f874ac2d4d-kube-api-access-mm6kj\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:50.702240 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.702242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:50.702457 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.702284 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkd78\" (UniqueName: \"kubernetes.io/projected/b4aa80dc-3684-4d0e-b46a-70d65c9c0782-kube-api-access-gkd78\") pod \"volume-data-source-validator-7c6cbb6c87-jnpmf\" (UID: \"b4aa80dc-3684-4d0e-b46a-70d65c9c0782\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"
Apr 23 01:11:50.702457 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:50.702381 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 01:11:50.702457 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:50.702437 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls podName:cc66612f-a771-483f-99ec-74f874ac2d4d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:51.202424164 +0000 UTC m=+118.988098500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4f877" (UID: "cc66612f-a771-483f-99ec-74f874ac2d4d") : secret "samples-operator-tls" not found
Apr 23 01:11:50.712400 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.712380 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6kj\" (UniqueName: \"kubernetes.io/projected/cc66612f-a771-483f-99ec-74f874ac2d4d-kube-api-access-mm6kj\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:50.802950 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.802842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkd78\" (UniqueName: \"kubernetes.io/projected/b4aa80dc-3684-4d0e-b46a-70d65c9c0782-kube-api-access-gkd78\") pod \"volume-data-source-validator-7c6cbb6c87-jnpmf\" (UID: \"b4aa80dc-3684-4d0e-b46a-70d65c9c0782\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"
Apr 23 01:11:50.809773 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.809742 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkd78\" (UniqueName: \"kubernetes.io/projected/b4aa80dc-3684-4d0e-b46a-70d65c9c0782-kube-api-access-gkd78\") pod \"volume-data-source-validator-7c6cbb6c87-jnpmf\" (UID: \"b4aa80dc-3684-4d0e-b46a-70d65c9c0782\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"
Apr 23 01:11:50.936340 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:50.936307 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"
Apr 23 01:11:51.048228 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.048175 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf"]
Apr 23 01:11:51.051793 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:11:51.051761 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4aa80dc_3684_4d0e_b46a_70d65c9c0782.slice/crio-8b7153455dbfe6c635f67f2141f7bdde1163e9e96f8d76f93d93858a050d47c5 WatchSource:0}: Error finding container 8b7153455dbfe6c635f67f2141f7bdde1163e9e96f8d76f93d93858a050d47c5: Status 404 returned error can't find the container with id 8b7153455dbfe6c635f67f2141f7bdde1163e9e96f8d76f93d93858a050d47c5
Apr 23 01:11:51.102166 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.102131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf" event={"ID":"b4aa80dc-3684-4d0e-b46a-70d65c9c0782","Type":"ContainerStarted","Data":"8b7153455dbfe6c635f67f2141f7bdde1163e9e96f8d76f93d93858a050d47c5"}
Apr 23 01:11:51.206104 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.206063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:51.206280 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:51.206184 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 01:11:51.206280 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:51.206241 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls podName:cc66612f-a771-483f-99ec-74f874ac2d4d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:52.206226907 +0000 UTC m=+119.991901242 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4f877" (UID: "cc66612f-a771-483f-99ec-74f874ac2d4d") : secret "samples-operator-tls" not found
Apr 23 01:11:51.525980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.525947 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-grc82"]
Apr 23 01:11:51.530183 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.530167 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.532395 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.532374 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 01:11:51.532692 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.532674 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 01:11:51.532781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.532695 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:11:51.532781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.532677 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 01:11:51.533317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.533304 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-2g9r8\""
Apr 23 01:11:51.538025 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.538005 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 01:11:51.538561 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.538540 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-grc82"]
Apr 23 01:11:51.609544 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.609511 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58715151-e1c9-475c-be78-487774704c95-trusted-ca\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.609544 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.609547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58715151-e1c9-475c-be78-487774704c95-serving-cert\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.609750 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.609695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfrkc\" (UniqueName: \"kubernetes.io/projected/58715151-e1c9-475c-be78-487774704c95-kube-api-access-nfrkc\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.609750 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.609725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58715151-e1c9-475c-be78-487774704c95-config\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.626565 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.626533 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"]
Apr 23 01:11:51.629503 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.629488 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.631511 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.631491 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 01:11:51.631511 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.631507 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:11:51.631657 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.631549 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 01:11:51.631905 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.631878 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 01:11:51.632055 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.632041 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-bscls\""
Apr 23 01:11:51.637706 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.637679 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"]
Apr 23 01:11:51.710776 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.710740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfrkc\" (UniqueName: \"kubernetes.io/projected/58715151-e1c9-475c-be78-487774704c95-kube-api-access-nfrkc\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.710966 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.710786 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58715151-e1c9-475c-be78-487774704c95-config\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.710966 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.710814 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a988bd2-2ed4-468c-ab78-3be082ee6a61-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.710966 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.710872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58715151-e1c9-475c-be78-487774704c95-trusted-ca\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.710966 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.710897 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58715151-e1c9-475c-be78-487774704c95-serving-cert\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.711174 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.710980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sj57\" (UniqueName: \"kubernetes.io/projected/2a988bd2-2ed4-468c-ab78-3be082ee6a61-kube-api-access-7sj57\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.711174 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.711036 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a988bd2-2ed4-468c-ab78-3be082ee6a61-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.711413 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.711391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58715151-e1c9-475c-be78-487774704c95-config\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.711526 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.711509 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58715151-e1c9-475c-be78-487774704c95-trusted-ca\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.713052 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.713031 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58715151-e1c9-475c-be78-487774704c95-serving-cert\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.718003 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.717981 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfrkc\" (UniqueName: \"kubernetes.io/projected/58715151-e1c9-475c-be78-487774704c95-kube-api-access-nfrkc\") pod \"console-operator-9d4b6777b-grc82\" (UID: \"58715151-e1c9-475c-be78-487774704c95\") " pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.812424 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.812348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a988bd2-2ed4-468c-ab78-3be082ee6a61-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.812564 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.812458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sj57\" (UniqueName: \"kubernetes.io/projected/2a988bd2-2ed4-468c-ab78-3be082ee6a61-kube-api-access-7sj57\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.812564 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.812490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a988bd2-2ed4-468c-ab78-3be082ee6a61-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.812898 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.812866 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a988bd2-2ed4-468c-ab78-3be082ee6a61-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.814935 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.814913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a988bd2-2ed4-468c-ab78-3be082ee6a61-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.820121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.820102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sj57\" (UniqueName: \"kubernetes.io/projected/2a988bd2-2ed4-468c-ab78-3be082ee6a61-kube-api-access-7sj57\") pod \"kube-storage-version-migrator-operator-6769c5d45-krk48\" (UID: \"2a988bd2-2ed4-468c-ab78-3be082ee6a61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.840152 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.840126 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-grc82"
Apr 23 01:11:51.939017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.938984 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"
Apr 23 01:11:51.963070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:51.963033 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-grc82"]
Apr 23 01:11:51.966072 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:11:51.966039 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58715151_e1c9_475c_be78_487774704c95.slice/crio-ec72000327e4498212cecc6889405191a862e1748799ab4c632627167e28a9b9 WatchSource:0}: Error finding container ec72000327e4498212cecc6889405191a862e1748799ab4c632627167e28a9b9: Status 404 returned error can't find the container with id ec72000327e4498212cecc6889405191a862e1748799ab4c632627167e28a9b9
Apr 23 01:11:52.066797 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:52.066722 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48"]
Apr 23 01:11:52.070951 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:11:52.070922 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a988bd2_2ed4_468c_ab78_3be082ee6a61.slice/crio-8b3b88da7a15e8f0e31de22cec9df82fef19e12ab279d1112a1dad9ef098b236 WatchSource:0}: Error finding container 8b3b88da7a15e8f0e31de22cec9df82fef19e12ab279d1112a1dad9ef098b236: Status 404 returned error can't find the container with id 8b3b88da7a15e8f0e31de22cec9df82fef19e12ab279d1112a1dad9ef098b236
Apr 23 01:11:52.105041 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:52.105010 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48" event={"ID":"2a988bd2-2ed4-468c-ab78-3be082ee6a61","Type":"ContainerStarted","Data":"8b3b88da7a15e8f0e31de22cec9df82fef19e12ab279d1112a1dad9ef098b236"}
Apr 23 01:11:52.105912 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:52.105890 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" event={"ID":"58715151-e1c9-475c-be78-487774704c95","Type":"ContainerStarted","Data":"ec72000327e4498212cecc6889405191a862e1748799ab4c632627167e28a9b9"}
Apr 23 01:11:52.215931 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:52.215892 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:52.216143 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:52.216042 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 01:11:52.216143 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:52.216115 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls podName:cc66612f-a771-483f-99ec-74f874ac2d4d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:54.216095941 +0000 UTC m=+122.001770287 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4f877" (UID: "cc66612f-a771-483f-99ec-74f874ac2d4d") : secret "samples-operator-tls" not found
Apr 23 01:11:53.108936 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:53.108851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf" event={"ID":"b4aa80dc-3684-4d0e-b46a-70d65c9c0782","Type":"ContainerStarted","Data":"3e3871bada94e4d7d79fe67488b0432b9508c68bba0713d2d23c06c862322e74"}
Apr 23 01:11:53.122160 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:53.122112 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-jnpmf" podStartSLOduration=1.5100990140000001 podStartE2EDuration="3.122096219s" podCreationTimestamp="2026-04-23 01:11:50 +0000 UTC" firstStartedPulling="2026-04-23 01:11:51.053491113 +0000 UTC m=+118.839165448" lastFinishedPulling="2026-04-23 01:11:52.665488305 +0000 UTC m=+120.451162653" observedRunningTime="2026-04-23 01:11:53.121007727 +0000 UTC m=+120.906682085" watchObservedRunningTime="2026-04-23 01:11:53.122096219 +0000 UTC m=+120.907770620"
Apr 23 01:11:54.233251 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:54.233207 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:54.233739 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:54.233363 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 01:11:54.233739 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:54.233446 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls podName:cc66612f-a771-483f-99ec-74f874ac2d4d nodeName:}" failed. No retries permitted until 2026-04-23 01:11:58.233427087 +0000 UTC m=+126.019101444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4f877" (UID: "cc66612f-a771-483f-99ec-74f874ac2d4d") : secret "samples-operator-tls" not found
Apr 23 01:11:55.114869 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:55.114835 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/0.log"
Apr 23 01:11:55.115011 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:55.114887 2569 generic.go:358] "Generic (PLEG): container finished" podID="58715151-e1c9-475c-be78-487774704c95" containerID="d2a13489170477d3e459d6b18c5df3cf32b3ea223b70b7ef006a86aa63e8661c" exitCode=255
Apr 23 01:11:55.115011 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:55.114981 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" event={"ID":"58715151-e1c9-475c-be78-487774704c95","Type":"ContainerDied","Data":"d2a13489170477d3e459d6b18c5df3cf32b3ea223b70b7ef006a86aa63e8661c"}
Apr 23 01:11:55.115227 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:55.115205 2569 scope.go:117] "RemoveContainer" containerID="d2a13489170477d3e459d6b18c5df3cf32b3ea223b70b7ef006a86aa63e8661c"
Apr 23 01:11:55.116371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:55.116350 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48" event={"ID":"2a988bd2-2ed4-468c-ab78-3be082ee6a61","Type":"ContainerStarted","Data":"3c2db433206bef6a3de94d8e2af1cf55cd4ec521c7cc6393f5e1a024cccf0c97"}
Apr 23 01:11:55.141884 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:55.141841 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48" podStartSLOduration=1.33522075 podStartE2EDuration="4.141827953s" podCreationTimestamp="2026-04-23 01:11:51 +0000 UTC" firstStartedPulling="2026-04-23 01:11:52.073071329 +0000 UTC m=+119.858745665" lastFinishedPulling="2026-04-23 01:11:54.879678529 +0000 UTC m=+122.665352868" observedRunningTime="2026-04-23 01:11:55.140673941 +0000 UTC m=+122.926348302" watchObservedRunningTime="2026-04-23 01:11:55.141827953 +0000 UTC m=+122.927502306"
Apr 23 01:11:56.122726 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:56.122699 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:11:56.123179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:56.123087 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/0.log"
Apr 23 01:11:56.123179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:56.123119 2569 generic.go:358] "Generic (PLEG): container finished" podID="58715151-e1c9-475c-be78-487774704c95" containerID="f2c81ff0863fd3188c0ba33e94f26538901e5e41fa03304e32d82d221aee3f70" exitCode=255
Apr 23 01:11:56.123287 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:56.123211 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" event={"ID":"58715151-e1c9-475c-be78-487774704c95","Type":"ContainerDied","Data":"f2c81ff0863fd3188c0ba33e94f26538901e5e41fa03304e32d82d221aee3f70"}
Apr 23 01:11:56.123287 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:56.123249 2569 scope.go:117] "RemoveContainer" containerID="d2a13489170477d3e459d6b18c5df3cf32b3ea223b70b7ef006a86aa63e8661c"
Apr 23 01:11:56.123482 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:56.123463 2569 scope.go:117] "RemoveContainer" containerID="f2c81ff0863fd3188c0ba33e94f26538901e5e41fa03304e32d82d221aee3f70"
Apr 23 01:11:56.123731 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:56.123712 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-grc82_openshift-console-operator(58715151-e1c9-475c-be78-487774704c95)\"" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" podUID="58715151-e1c9-475c-be78-487774704c95"
Apr 23 01:11:57.127075 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:57.127046 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:11:57.127543 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:57.127501 2569 scope.go:117] "RemoveContainer" containerID="f2c81ff0863fd3188c0ba33e94f26538901e5e41fa03304e32d82d221aee3f70"
Apr 23 01:11:57.127742 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:57.127721 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-grc82_openshift-console-operator(58715151-e1c9-475c-be78-487774704c95)\"" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" podUID="58715151-e1c9-475c-be78-487774704c95"
Apr 23 01:11:57.359895 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:57.359835 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"
Apr 23 01:11:57.360091 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:57.359995 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:57.360091 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:57.360072 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls podName:287bea1a-68cd-49c1-a36a-2fc24dbc7719 nodeName:}" failed. No retries permitted until 2026-04-23 01:12:13.360055199 +0000 UTC m=+141.145729546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qgtjh" (UID: "287bea1a-68cd-49c1-a36a-2fc24dbc7719") : secret "cluster-monitoring-operator-tls" not found
Apr 23 01:11:58.269122 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.269068 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"
Apr 23 01:11:58.269529 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:58.269222 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 01:11:58.269529 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:11:58.269287 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls podName:cc66612f-a771-483f-99ec-74f874ac2d4d nodeName:}" failed. No retries permitted until 2026-04-23 01:12:06.269271703 +0000 UTC m=+134.054946038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4f877" (UID: "cc66612f-a771-483f-99ec-74f874ac2d4d") : secret "samples-operator-tls" not found
Apr 23 01:11:58.701386 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.701346 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sjl95"]
Apr 23 01:11:58.705484 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.705461 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.707729 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.707705 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 01:11:58.708412 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.708391 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-tdggg\"" Apr 23 01:11:58.708499 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.708414 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 01:11:58.708499 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.708413 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 01:11:58.708499 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.708440 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 01:11:58.711197 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.711172 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sjl95"] Apr 23 01:11:58.873135 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.873100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzml\" (UniqueName: \"kubernetes.io/projected/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-kube-api-access-bzzml\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.873322 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.873181 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-signing-key\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.873322 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.873260 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-signing-cabundle\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.974649 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.974559 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-signing-key\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.974649 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.974602 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-signing-cabundle\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.974869 ip-10-0-138-235 kubenswrapper[2569]: I0423 
01:11:58.974848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzml\" (UniqueName: \"kubernetes.io/projected/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-kube-api-access-bzzml\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.975647 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.975606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-signing-cabundle\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.976861 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.976842 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-signing-key\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:58.982820 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:58.982798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzml\" (UniqueName: \"kubernetes.io/projected/6fb1014e-f651-45c2-a9c4-0daec10ce4e1-kube-api-access-bzzml\") pod \"service-ca-865cb79987-sjl95\" (UID: \"6fb1014e-f651-45c2-a9c4-0daec10ce4e1\") " pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:59.014538 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:59.014517 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sjl95" Apr 23 01:11:59.126070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:59.126036 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sjl95"] Apr 23 01:11:59.128903 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:11:59.128866 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fb1014e_f651_45c2_a9c4_0daec10ce4e1.slice/crio-f48ff0e543b4bf15e8329f507c31982d85f819ebf34ef39245903c520afc247c WatchSource:0}: Error finding container f48ff0e543b4bf15e8329f507c31982d85f819ebf34ef39245903c520afc247c: Status 404 returned error can't find the container with id f48ff0e543b4bf15e8329f507c31982d85f819ebf34ef39245903c520afc247c Apr 23 01:11:59.133361 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:11:59.133317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sjl95" event={"ID":"6fb1014e-f651-45c2-a9c4-0daec10ce4e1","Type":"ContainerStarted","Data":"f48ff0e543b4bf15e8329f507c31982d85f819ebf34ef39245903c520afc247c"} Apr 23 01:12:01.140764 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:01.140726 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sjl95" event={"ID":"6fb1014e-f651-45c2-a9c4-0daec10ce4e1","Type":"ContainerStarted","Data":"10a9e1e5752bbdbd8abd0ed8ddd170d3525bb06f19a6c42866090259e63ca6d3"} Apr 23 01:12:01.155897 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:01.155842 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-sjl95" podStartSLOduration=1.530363605 podStartE2EDuration="3.155828052s" 
podCreationTimestamp="2026-04-23 01:11:58 +0000 UTC" firstStartedPulling="2026-04-23 01:11:59.131259681 +0000 UTC m=+126.916934015" lastFinishedPulling="2026-04-23 01:12:00.75672412 +0000 UTC m=+128.542398462" observedRunningTime="2026-04-23 01:12:01.154950483 +0000 UTC m=+128.940624840" watchObservedRunningTime="2026-04-23 01:12:01.155828052 +0000 UTC m=+128.941502459" Apr 23 01:12:01.840953 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:01.840913 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" Apr 23 01:12:01.840953 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:01.840957 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" Apr 23 01:12:01.841405 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:01.841391 2569 scope.go:117] "RemoveContainer" containerID="f2c81ff0863fd3188c0ba33e94f26538901e5e41fa03304e32d82d221aee3f70" Apr 23 01:12:01.841646 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:01.841600 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-grc82_openshift-console-operator(58715151-e1c9-475c-be78-487774704c95)\"" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" podUID="58715151-e1c9-475c-be78-487774704c95" Apr 23 01:12:02.607021 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:02.606979 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:12:02.608016 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:02.607991 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 01:12:02.608210 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:02.608199 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs podName:608e8d52-e2cd-48e3-b524-0f0d764d9501 nodeName:}" failed. No retries permitted until 2026-04-23 01:14:04.608179203 +0000 UTC m=+252.393853555 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs") pod "network-metrics-daemon-5mm4v" (UID: "608e8d52-e2cd-48e3-b524-0f0d764d9501") : secret "metrics-daemon-secret" not found Apr 23 01:12:06.337233 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:06.337196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877" Apr 23 01:12:06.339576 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:06.339543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc66612f-a771-483f-99ec-74f874ac2d4d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4f877\" (UID: \"cc66612f-a771-483f-99ec-74f874ac2d4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877" Apr 23 01:12:06.437908 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:06.437881 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-vpsm2\"" Apr 23 01:12:06.445530 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:06.445503 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877" Apr 23 01:12:06.560708 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:06.560588 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877"] Apr 23 01:12:07.155925 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:07.155879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877" event={"ID":"cc66612f-a771-483f-99ec-74f874ac2d4d","Type":"ContainerStarted","Data":"eb4fa11d8e43cf71f5044c21a6a064bbf4a090d426168571ce090f70bed1cb3e"} Apr 23 01:12:09.164594 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:09.164558 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877" event={"ID":"cc66612f-a771-483f-99ec-74f874ac2d4d","Type":"ContainerStarted","Data":"7e1aa37957f0f650289a4a751ed3ea81227a48946daea75dab6329a7268735f4"} Apr 23 01:12:09.164594 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:09.164594 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877" event={"ID":"cc66612f-a771-483f-99ec-74f874ac2d4d","Type":"ContainerStarted","Data":"9a49ebc7fe0618984646bf24c0a20d8c6e9f8f4751ab3eaebd6d11c355fcf7a3"} Apr 23 01:12:09.181686 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:09.181642 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4f877" podStartSLOduration=17.067661336 podStartE2EDuration="19.181625377s" podCreationTimestamp="2026-04-23 01:11:50 +0000 UTC" firstStartedPulling="2026-04-23 01:12:06.611996066 +0000 UTC m=+134.397670400" lastFinishedPulling="2026-04-23 01:12:08.725960106 +0000 UTC m=+136.511634441" 
observedRunningTime="2026-04-23 01:12:09.18123279 +0000 UTC m=+136.966907147" watchObservedRunningTime="2026-04-23 01:12:09.181625377 +0000 UTC m=+136.967299724" Apr 23 01:12:13.394006 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:13.393962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh" Apr 23 01:12:13.396324 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:13.396298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/287bea1a-68cd-49c1-a36a-2fc24dbc7719-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qgtjh\" (UID: \"287bea1a-68cd-49c1-a36a-2fc24dbc7719\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh" Apr 23 01:12:13.607154 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:13.607124 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-7bzlt\"" Apr 23 01:12:13.614896 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:13.614878 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh" Apr 23 01:12:13.728539 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:13.728506 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh"] Apr 23 01:12:13.731636 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:13.731586 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod287bea1a_68cd_49c1_a36a_2fc24dbc7719.slice/crio-94925003b0790f93b79d211f044acd1d0dec987991ad8fa46b9c0e24c9547b67 WatchSource:0}: Error finding container 94925003b0790f93b79d211f044acd1d0dec987991ad8fa46b9c0e24c9547b67: Status 404 returned error can't find the container with id 94925003b0790f93b79d211f044acd1d0dec987991ad8fa46b9c0e24c9547b67 Apr 23 01:12:14.178580 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:14.178544 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh" event={"ID":"287bea1a-68cd-49c1-a36a-2fc24dbc7719","Type":"ContainerStarted","Data":"94925003b0790f93b79d211f044acd1d0dec987991ad8fa46b9c0e24c9547b67"} Apr 23 01:12:15.762474 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:15.762386 2569 scope.go:117] "RemoveContainer" containerID="f2c81ff0863fd3188c0ba33e94f26538901e5e41fa03304e32d82d221aee3f70" Apr 23 01:12:16.185513 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:16.185478 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh" event={"ID":"287bea1a-68cd-49c1-a36a-2fc24dbc7719","Type":"ContainerStarted","Data":"7dba4d92949ced6b83a50d477343595d37d149d398cdc189e79de34c90fbd375"} Apr 23 01:12:16.187166 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:16.187143 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log" Apr 23 
01:12:16.187288 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:16.187238 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" event={"ID":"58715151-e1c9-475c-be78-487774704c95","Type":"ContainerStarted","Data":"1daa3d0be313fe5c7cd924310eabeee48c0d241391f48484a11fea88fc17c3ab"} Apr 23 01:12:16.187513 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:16.187496 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" Apr 23 01:12:16.200036 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:16.199979 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qgtjh" podStartSLOduration=33.454782992 podStartE2EDuration="35.199966065s" podCreationTimestamp="2026-04-23 01:11:41 +0000 UTC" firstStartedPulling="2026-04-23 01:12:13.733522451 +0000 UTC m=+141.519196786" lastFinishedPulling="2026-04-23 01:12:15.478705509 +0000 UTC m=+143.264379859" observedRunningTime="2026-04-23 01:12:16.199017858 +0000 UTC m=+143.984692214" watchObservedRunningTime="2026-04-23 01:12:16.199966065 +0000 UTC m=+143.985640422" Apr 23 01:12:16.214367 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:16.214326 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" podStartSLOduration=22.305328875 podStartE2EDuration="25.214316225s" podCreationTimestamp="2026-04-23 01:11:51 +0000 UTC" firstStartedPulling="2026-04-23 01:11:51.968396952 +0000 UTC m=+119.754071301" lastFinishedPulling="2026-04-23 01:11:54.877384301 +0000 UTC m=+122.663058651" observedRunningTime="2026-04-23 01:12:16.213561122 +0000 UTC m=+143.999235474" watchObservedRunningTime="2026-04-23 01:12:16.214316225 +0000 UTC m=+143.999990581" Apr 23 01:12:17.187550 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:17.187500 2569 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-grc82 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 23 01:12:17.187967 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:17.187583 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" podUID="58715151-e1c9-475c-be78-487774704c95" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 23 01:12:17.305203 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:17.305165 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-grc82" Apr 23 01:12:20.394157 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.394125 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j5x9h"] Apr 23 01:12:20.396772 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.396751 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.400123 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.400099 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8pwvt\"" Apr 23 01:12:20.400123 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.400117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 01:12:20.400298 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.400165 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 01:12:20.408330 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.408309 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j5x9h"] Apr 23 01:12:20.455601 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.455571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55cf13db-98bb-4d59-9791-81984d001d0c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.455774 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.455655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfbwq\" (UniqueName: \"kubernetes.io/projected/55cf13db-98bb-4d59-9791-81984d001d0c-kube-api-access-nfbwq\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.455774 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.455689 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/55cf13db-98bb-4d59-9791-81984d001d0c-data-volume\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.455774 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.455723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55cf13db-98bb-4d59-9791-81984d001d0c-crio-socket\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.455774 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.455740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55cf13db-98bb-4d59-9791-81984d001d0c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.487632 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.487589 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vttwn"] Apr 23 01:12:20.489500 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.489480 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-6bcc868b7-td9fq"] Apr 23 01:12:20.489658 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.489640 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.491101 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.491081 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv"] Apr 23 01:12:20.491276 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.491209 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-td9fq" Apr 23 01:12:20.492454 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.492436 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 01:12:20.492561 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.492483 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 01:12:20.492793 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.492775 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" Apr 23 01:12:20.492883 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.492841 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ln5jb\"" Apr 23 01:12:20.493248 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.493233 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 01:12:20.493988 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.493970 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-gfw2z\"" Apr 23 01:12:20.494070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.494042 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 01:12:20.495037 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.495019 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-lwf5v\"" Apr 23 01:12:20.495103 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.495071 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 01:12:20.501113 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.501091 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vttwn"] Apr 23 01:12:20.502394 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.502348 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-td9fq"] Apr 23 01:12:20.503773 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.503753 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv"] Apr 23 01:12:20.556525 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/55cf13db-98bb-4d59-9791-81984d001d0c-crio-socket\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.556525 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556526 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55cf13db-98bb-4d59-9791-81984d001d0c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.556740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556562 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55cf13db-98bb-4d59-9791-81984d001d0c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.556740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556596 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-j8mwv\" (UID: \"fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" Apr 23 01:12:20.556740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556638 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55cf13db-98bb-4d59-9791-81984d001d0c-crio-socket\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.556740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0fbe8b71-af85-4c24-a839-1dc68b57173b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vttwn\" (UID: \"0fbe8b71-af85-4c24-a839-1dc68b57173b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.556740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfbwq\" (UniqueName: \"kubernetes.io/projected/55cf13db-98bb-4d59-9791-81984d001d0c-kube-api-access-nfbwq\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.556919 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556757 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hh2\" (UniqueName: \"kubernetes.io/projected/74ab69de-e0f4-4c2e-9254-d9fc69aed149-kube-api-access-c5hh2\") pod \"downloads-6bcc868b7-td9fq\" (UID: \"74ab69de-e0f4-4c2e-9254-d9fc69aed149\") " pod="openshift-console/downloads-6bcc868b7-td9fq" Apr 23 01:12:20.556919 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556790 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/55cf13db-98bb-4d59-9791-81984d001d0c-data-volume\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.556919 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.556819 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0fbe8b71-af85-4c24-a839-1dc68b57173b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vttwn\" (UID: \"0fbe8b71-af85-4c24-a839-1dc68b57173b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.557184 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.557164 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/55cf13db-98bb-4d59-9791-81984d001d0c-data-volume\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.557446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.557404 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55cf13db-98bb-4d59-9791-81984d001d0c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.559222 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.559197 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55cf13db-98bb-4d59-9791-81984d001d0c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.572324 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.572303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfbwq\" (UniqueName: \"kubernetes.io/projected/55cf13db-98bb-4d59-9791-81984d001d0c-kube-api-access-nfbwq\") pod \"insights-runtime-extractor-j5x9h\" (UID: \"55cf13db-98bb-4d59-9791-81984d001d0c\") " pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.657588 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.657508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0fbe8b71-af85-4c24-a839-1dc68b57173b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vttwn\" (UID: \"0fbe8b71-af85-4c24-a839-1dc68b57173b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.657743 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.657636 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-j8mwv\" (UID: \"fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" Apr 23 01:12:20.657743 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.657688 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/0fbe8b71-af85-4c24-a839-1dc68b57173b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vttwn\" (UID: \"0fbe8b71-af85-4c24-a839-1dc68b57173b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.657854 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.657750 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hh2\" (UniqueName: \"kubernetes.io/projected/74ab69de-e0f4-4c2e-9254-d9fc69aed149-kube-api-access-c5hh2\") pod \"downloads-6bcc868b7-td9fq\" (UID: \"74ab69de-e0f4-4c2e-9254-d9fc69aed149\") " pod="openshift-console/downloads-6bcc868b7-td9fq" Apr 23 01:12:20.658345 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.658321 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0fbe8b71-af85-4c24-a839-1dc68b57173b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vttwn\" (UID: \"0fbe8b71-af85-4c24-a839-1dc68b57173b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.660028 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.660006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0fbe8b71-af85-4c24-a839-1dc68b57173b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vttwn\" (UID: \"0fbe8b71-af85-4c24-a839-1dc68b57173b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.660110 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.660026 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-j8mwv\" (UID: \"fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" Apr 23 01:12:20.665578 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.665557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hh2\" (UniqueName: \"kubernetes.io/projected/74ab69de-e0f4-4c2e-9254-d9fc69aed149-kube-api-access-c5hh2\") pod \"downloads-6bcc868b7-td9fq\" (UID: \"74ab69de-e0f4-4c2e-9254-d9fc69aed149\") " pod="openshift-console/downloads-6bcc868b7-td9fq" Apr 23 01:12:20.708009 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.707971 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j5x9h" Apr 23 01:12:20.800757 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.800723 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" Apr 23 01:12:20.807542 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.807515 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-td9fq" Apr 23 01:12:20.812316 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.812294 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" Apr 23 01:12:20.829568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.829505 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j5x9h"] Apr 23 01:12:20.833296 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:20.833254 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55cf13db_98bb_4d59_9791_81984d001d0c.slice/crio-badf1a24cbc63d59d8ccab99ac407e55d39e942deadeefdd92e27f02dc97dd3a WatchSource:0}: Error finding container badf1a24cbc63d59d8ccab99ac407e55d39e942deadeefdd92e27f02dc97dd3a: Status 404 returned error can't find the container with id badf1a24cbc63d59d8ccab99ac407e55d39e942deadeefdd92e27f02dc97dd3a Apr 23 01:12:20.944381 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.943543 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vttwn"] Apr 23 01:12:20.947905 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:20.947865 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fbe8b71_af85_4c24_a839_1dc68b57173b.slice/crio-fe854ddc505ca1345f0aff6397c42ba492a93f580d380e09e0fd93bff7c4de82 WatchSource:0}: Error finding container fe854ddc505ca1345f0aff6397c42ba492a93f580d380e09e0fd93bff7c4de82: Status 404 returned error can't find the container with id fe854ddc505ca1345f0aff6397c42ba492a93f580d380e09e0fd93bff7c4de82 Apr 23 01:12:20.964505 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.964476 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-td9fq"] Apr 23 01:12:20.967378 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:20.967347 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ab69de_e0f4_4c2e_9254_d9fc69aed149.slice/crio-c83918ff448174d4a0191bb770bef41a33aa6f4f30968b3f0e23b03c43a22f03 WatchSource:0}: Error finding container c83918ff448174d4a0191bb770bef41a33aa6f4f30968b3f0e23b03c43a22f03: Status 404 returned error can't find the container with id c83918ff448174d4a0191bb770bef41a33aa6f4f30968b3f0e23b03c43a22f03 Apr 23 01:12:20.977979 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:20.977956 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv"] Apr 23 01:12:20.980496 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:20.980471 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb37d61f_d23a_4a04_a8cc_a5b2f18a0faf.slice/crio-6b6a910efe8e56d1d5f4e1d1cff45ab0f2c62f44301e8ed4df8f30728d7094a9 WatchSource:0}: Error finding container 6b6a910efe8e56d1d5f4e1d1cff45ab0f2c62f44301e8ed4df8f30728d7094a9: Status 404 returned error can't find the container with id 6b6a910efe8e56d1d5f4e1d1cff45ab0f2c62f44301e8ed4df8f30728d7094a9 Apr 23 01:12:21.199965 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:21.199874 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" event={"ID":"0fbe8b71-af85-4c24-a839-1dc68b57173b","Type":"ContainerStarted","Data":"fe854ddc505ca1345f0aff6397c42ba492a93f580d380e09e0fd93bff7c4de82"} Apr 23 01:12:21.201188 
ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:21.201159 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5x9h" event={"ID":"55cf13db-98bb-4d59-9791-81984d001d0c","Type":"ContainerStarted","Data":"1455d64d1086943224860c7f1d404326bdf0c4ccc5dd5fa5d6242628cf14250f"} Apr 23 01:12:21.201349 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:21.201195 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5x9h" event={"ID":"55cf13db-98bb-4d59-9791-81984d001d0c","Type":"ContainerStarted","Data":"badf1a24cbc63d59d8ccab99ac407e55d39e942deadeefdd92e27f02dc97dd3a"} Apr 23 01:12:21.202155 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:21.202134 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" event={"ID":"fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf","Type":"ContainerStarted","Data":"6b6a910efe8e56d1d5f4e1d1cff45ab0f2c62f44301e8ed4df8f30728d7094a9"} Apr 23 01:12:21.203022 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:21.202998 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-td9fq" event={"ID":"74ab69de-e0f4-4c2e-9254-d9fc69aed149","Type":"ContainerStarted","Data":"c83918ff448174d4a0191bb770bef41a33aa6f4f30968b3f0e23b03c43a22f03"} Apr 23 01:12:22.208363 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:22.208325 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5x9h" event={"ID":"55cf13db-98bb-4d59-9791-81984d001d0c","Type":"ContainerStarted","Data":"1360d25ff1c3ffaaf3a23c54e4e6c156c01787772941e66609386e6cf1c0b74a"} Apr 23 01:12:23.213658 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:23.213599 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" event={"ID":"fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf","Type":"ContainerStarted","Data":"798b3f48d0b0396c859f8fc375426b7d943db9659a96e8cb3e2d465fee34e2ae"} Apr 23 01:12:23.214120 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:23.213748 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" Apr 23 01:12:23.215870 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:23.215838 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" event={"ID":"0fbe8b71-af85-4c24-a839-1dc68b57173b","Type":"ContainerStarted","Data":"c53e3518aa0467aa87157d235ae9f58f120b84358e0d04049d920e8e73753ade"} Apr 23 01:12:23.219657 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:23.219631 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" Apr 23 01:12:23.228087 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:23.228027 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-j8mwv" podStartSLOduration=1.794632038 podStartE2EDuration="3.228011663s" podCreationTimestamp="2026-04-23 01:12:20 +0000 UTC" firstStartedPulling="2026-04-23 01:12:20.982357325 +0000 UTC m=+148.768031664" lastFinishedPulling="2026-04-23 01:12:22.415736943 +0000 UTC m=+150.201411289" observedRunningTime="2026-04-23 01:12:23.227351324 +0000 UTC m=+151.013025682" 
watchObservedRunningTime="2026-04-23 01:12:23.228011663 +0000 UTC m=+151.013686022" Apr 23 01:12:23.245769 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:23.245713 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vttwn" podStartSLOduration=1.782460887 podStartE2EDuration="3.245695676s" podCreationTimestamp="2026-04-23 01:12:20 +0000 UTC" firstStartedPulling="2026-04-23 01:12:20.950311481 +0000 UTC m=+148.735985829" lastFinishedPulling="2026-04-23 01:12:22.413546279 +0000 UTC m=+150.199220618" observedRunningTime="2026-04-23 01:12:23.245374624 +0000 UTC m=+151.031048974" watchObservedRunningTime="2026-04-23 01:12:23.245695676 +0000 UTC m=+151.031370033" Apr 23 01:12:24.026054 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.026018 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-b4wg8"] Apr 23 01:12:24.028358 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.028335 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.031367 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.031346 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 01:12:24.031808 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.031502 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-h5vt2\"" Apr 23 01:12:24.031808 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.031555 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 01:12:24.031808 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.031624 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 01:12:24.036872 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.036535 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-b4wg8"] Apr 23 01:12:24.088782 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.088741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3158a19-531a-473b-9d1f-1b765e094c1b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.088941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.088847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3158a19-531a-473b-9d1f-1b765e094c1b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.088941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.088879 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxpzr\" (UniqueName: \"kubernetes.io/projected/f3158a19-531a-473b-9d1f-1b765e094c1b-kube-api-access-cxpzr\") pod 
\"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.088941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.088917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3158a19-531a-473b-9d1f-1b765e094c1b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.190301 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.190246 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3158a19-531a-473b-9d1f-1b765e094c1b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.190447 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.190369 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3158a19-531a-473b-9d1f-1b765e094c1b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.190447 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.190402 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxpzr\" (UniqueName: \"kubernetes.io/projected/f3158a19-531a-473b-9d1f-1b765e094c1b-kube-api-access-cxpzr\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.190447 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.190442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3158a19-531a-473b-9d1f-1b765e094c1b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.191777 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.191754 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3158a19-531a-473b-9d1f-1b765e094c1b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.193233 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.193214 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3158a19-531a-473b-9d1f-1b765e094c1b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.193373 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.193352 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/f3158a19-531a-473b-9d1f-1b765e094c1b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.198333 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.198308 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxpzr\" (UniqueName: \"kubernetes.io/projected/f3158a19-531a-473b-9d1f-1b765e094c1b-kube-api-access-cxpzr\") pod \"prometheus-operator-5676c8c784-b4wg8\" (UID: \"f3158a19-531a-473b-9d1f-1b765e094c1b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.220522 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.220486 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5x9h" event={"ID":"55cf13db-98bb-4d59-9791-81984d001d0c","Type":"ContainerStarted","Data":"8682f5db9c4bee133b7ace2ad53385984357ade756157c134e124571c261f04b"} Apr 23 01:12:24.236263 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.236210 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j5x9h" podStartSLOduration=1.528119944 podStartE2EDuration="4.236191222s" podCreationTimestamp="2026-04-23 01:12:20 +0000 UTC" firstStartedPulling="2026-04-23 01:12:20.916724772 +0000 UTC m=+148.702399114" lastFinishedPulling="2026-04-23 01:12:23.624796041 +0000 UTC m=+151.410470392" observedRunningTime="2026-04-23 01:12:24.235335059 +0000 UTC m=+152.021009417" watchObservedRunningTime="2026-04-23 01:12:24.236191222 +0000 UTC m=+152.021865582" Apr 23 01:12:24.338994 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.338909 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" Apr 23 01:12:24.468667 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:24.468630 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-b4wg8"] Apr 23 01:12:24.471867 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:24.471841 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3158a19_531a_473b_9d1f_1b765e094c1b.slice/crio-56efc9eebad7795e6a41df6a15fdd32063bc2d19d50b9111b58998fa2aed4359 WatchSource:0}: Error finding container 56efc9eebad7795e6a41df6a15fdd32063bc2d19d50b9111b58998fa2aed4359: Status 404 returned error can't find the container with id 56efc9eebad7795e6a41df6a15fdd32063bc2d19d50b9111b58998fa2aed4359 Apr 23 01:12:25.225947 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:25.225894 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" event={"ID":"f3158a19-531a-473b-9d1f-1b765e094c1b","Type":"ContainerStarted","Data":"56efc9eebad7795e6a41df6a15fdd32063bc2d19d50b9111b58998fa2aed4359"} Apr 23 01:12:26.230626 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:26.230524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" event={"ID":"f3158a19-531a-473b-9d1f-1b765e094c1b","Type":"ContainerStarted","Data":"ed7ae9874f5c951778d88f81b192d93e3ea5376f8cae99fe7b6f5ceb576026ab"} Apr 23 01:12:26.230626 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:26.230572 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" event={"ID":"f3158a19-531a-473b-9d1f-1b765e094c1b","Type":"ContainerStarted","Data":"8bec0953014c9f30f17296b826b447833a4648f563dc39eb925c969a356b8c62"} Apr 23 01:12:26.246828 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:26.246776 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-b4wg8" podStartSLOduration=0.80354726 podStartE2EDuration="2.246761156s" podCreationTimestamp="2026-04-23 01:12:24 +0000 UTC" firstStartedPulling="2026-04-23 01:12:24.474036814 +0000 UTC m=+152.259711148" lastFinishedPulling="2026-04-23 01:12:25.917250703 +0000 UTC m=+153.702925044" observedRunningTime="2026-04-23 01:12:26.244954253 +0000 UTC m=+154.030628611" watchObservedRunningTime="2026-04-23 01:12:26.246761156 +0000 UTC m=+154.032435513" Apr 23 01:12:27.190041 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.190001 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c7ccd58b9-dsg9x"] Apr 23 01:12:27.192308 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.192286 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.194545 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.194520 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 01:12:27.194716 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.194521 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cm99s\"" Apr 23 01:12:27.195284 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.195257 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 01:12:27.195595 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.195428 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 01:12:27.195595 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.195428 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 01:12:27.195595 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.195489 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 01:12:27.201977 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.201960 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7ccd58b9-dsg9x"] Apr 23 01:12:27.317736 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.317694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-service-ca\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.318198 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.317765 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-oauth-serving-cert\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.318198 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.317935 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-config\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.318198 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.317979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-serving-cert\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.318198 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.318041 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-oauth-config\") pod 
\"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.318198 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.318079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dhnw\" (UniqueName: \"kubernetes.io/projected/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-kube-api-access-8dhnw\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.419251 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.419208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-config\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.419251 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.419251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-serving-cert\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.419453 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.419274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-oauth-config\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.419453 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.419305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dhnw\" (UniqueName: \"kubernetes.io/projected/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-kube-api-access-8dhnw\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.419453 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.419376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-service-ca\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.419453 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.419425 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-oauth-serving-cert\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.419972 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.419950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-config\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.420064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.420028 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-oauth-serving-cert\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.420214 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.420190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-service-ca\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.421728 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.421709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-oauth-config\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.421821 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.421805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-serving-cert\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.427279 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.427257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dhnw\" (UniqueName: \"kubernetes.io/projected/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-kube-api-access-8dhnw\") pod \"console-6c7ccd58b9-dsg9x\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.503229 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.503112 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:27.630268 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:27.630229 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7ccd58b9-dsg9x"] Apr 23 01:12:27.633204 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:27.633172 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06f04f0_7ac2_4abd_91c1_11d56af6cdd9.slice/crio-d8137b01f7b8f2731d80a49dcb8b1c6be0102526e77f5507b117d0d135ca178b WatchSource:0}: Error finding container d8137b01f7b8f2731d80a49dcb8b1c6be0102526e77f5507b117d0d135ca178b: Status 404 returned error can't find the container with id d8137b01f7b8f2731d80a49dcb8b1c6be0102526e77f5507b117d0d135ca178b Apr 23 01:12:28.153008 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:28.152960 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" podUID="1679ef2e-e9c0-4738-a19c-35582013fe18" Apr 23 01:12:28.167263 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:28.167217 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rpd44" podUID="746bf4ef-4ba5-45d1-9cc6-ab6354c10b18" Apr 23 01:12:28.237922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.237879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7ccd58b9-dsg9x" event={"ID":"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9","Type":"ContainerStarted","Data":"d8137b01f7b8f2731d80a49dcb8b1c6be0102526e77f5507b117d0d135ca178b"} Apr 23 01:12:28.237922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.237913 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:12:28.247428 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:28.247394 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bk66m" podUID="79d4f338-f964-4fa6-985e-50bbb3b105a9" Apr 23 01:12:28.390203 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.390163 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kdwf9"] Apr 23 01:12:28.393122 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.393101 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.395416 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.395390 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 01:12:28.395685 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.395668 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-85prc\"" Apr 23 01:12:28.395979 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.395957 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 01:12:28.396080 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.395981 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.531586 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-textfile\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.531658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-tls\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.531705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-root\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.531766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.531804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-wtmp\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.531908 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3638c0-7a9c-4827-92b6-19af4e48804e-metrics-client-ca\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 
01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.531938 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-sys\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.532006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-accelerators-collector-config\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.532179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.532088 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbz2\" (UniqueName: \"kubernetes.io/projected/4a3638c0-7a9c-4827-92b6-19af4e48804e-kube-api-access-6vbz2\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.632784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-root\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.632867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.632899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-wtmp\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.632938 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3638c0-7a9c-4827-92b6-19af4e48804e-metrics-client-ca\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.632964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-sys\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.632996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-accelerators-collector-config\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.633036 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbz2\" (UniqueName: \"kubernetes.io/projected/4a3638c0-7a9c-4827-92b6-19af4e48804e-kube-api-access-6vbz2\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.633073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-textfile\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.633110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-tls\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:28.633260 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:28.633323 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-tls podName:4a3638c0-7a9c-4827-92b6-19af4e48804e nodeName:}" failed. No retries permitted until 2026-04-23 01:12:29.133303025 +0000 UTC m=+156.918977363 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-tls") pod "node-exporter-kdwf9" (UID: "4a3638c0-7a9c-4827-92b6-19af4e48804e") : secret "node-exporter-tls" not found Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.633557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-root\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.634509 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-accelerators-collector-config\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.634578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-sys\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.634649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3638c0-7a9c-4827-92b6-19af4e48804e-metrics-client-ca\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.634954 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.634709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-wtmp\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.635862 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.634985 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-textfile\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.636347 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.636307 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:28.642876 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:28.642827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbz2\" (UniqueName: \"kubernetes.io/projected/4a3638c0-7a9c-4827-92b6-19af4e48804e-kube-api-access-6vbz2\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:29.138849 ip-10-0-138-235 
kubenswrapper[2569]: I0423 01:12:29.138809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-tls\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:29.142455 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:29.142425 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3638c0-7a9c-4827-92b6-19af4e48804e-node-exporter-tls\") pod \"node-exporter-kdwf9\" (UID: \"4a3638c0-7a9c-4827-92b6-19af4e48804e\") " pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:29.305774 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:29.305696 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kdwf9" Apr 23 01:12:29.319118 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:29.318317 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a3638c0_7a9c_4827_92b6_19af4e48804e.slice/crio-66d3b7d8a290572166fc53d436e8763a31910145665f63728d700d64bc047f11 WatchSource:0}: Error finding container 66d3b7d8a290572166fc53d436e8763a31910145665f63728d700d64bc047f11: Status 404 returned error can't find the container with id 66d3b7d8a290572166fc53d436e8763a31910145665f63728d700d64bc047f11 Apr 23 01:12:29.774389 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:12:29.774350 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5mm4v" podUID="608e8d52-e2cd-48e3-b524-0f0d764d9501" Apr 23 01:12:30.246849 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:30.246773 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdwf9" event={"ID":"4a3638c0-7a9c-4827-92b6-19af4e48804e","Type":"ContainerStarted","Data":"66d3b7d8a290572166fc53d436e8763a31910145665f63728d700d64bc047f11"} Apr 23 01:12:33.082095 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.082050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:12:33.082560 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.082148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:12:33.085201 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.085174 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"image-registry-65b68c6657-v6kp5\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:12:33.085517 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.085495 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/746bf4ef-4ba5-45d1-9cc6-ab6354c10b18-metrics-tls\") pod \"dns-default-rpd44\" (UID: \"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18\") " pod="openshift-dns/dns-default-rpd44" Apr 23 01:12:33.183302 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.183257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:12:33.186029 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.186005 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79d4f338-f964-4fa6-985e-50bbb3b105a9-cert\") pod \"ingress-canary-bk66m\" (UID: \"79d4f338-f964-4fa6-985e-50bbb3b105a9\") " pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:12:33.341546 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.341467 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ndfmc\"" Apr 23 01:12:33.350274 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.350245 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:12:33.576823 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.576788 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-549565c777-rzlzc"] Apr 23 01:12:33.581560 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.581537 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.584119 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.584061 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 01:12:33.584119 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.584071 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 01:12:33.584308 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.584151 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 01:12:33.584308 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.584292 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dfgq7\"" Apr 23 01:12:33.584438 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.584360 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 01:12:33.584540 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.584525 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 01:12:33.589801 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.589779 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-549565c777-rzlzc"] Apr 23 01:12:33.590990 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.590905 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 01:12:33.688126 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjsb\" (UniqueName: \"kubernetes.io/projected/7a8c586f-d7a8-4d9d-9816-42be04262414-kube-api-access-tdjsb\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.688303 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-serving-certs-ca-bundle\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.688303 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-telemeter-client-tls\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.688303 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688273 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-metrics-client-ca\") pod 
\"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.688433 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688379 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-federate-client-tls\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.688433 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688405 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.688433 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-secret-telemeter-client\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.688551 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.688481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-telemeter-trusted-ca-bundle\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790006 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.789958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-federate-client-tls\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790166 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.790015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790166 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.790055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-secret-telemeter-client\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790166 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.790095 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-telemeter-trusted-ca-bundle\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790332 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.790261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjsb\" (UniqueName: \"kubernetes.io/projected/7a8c586f-d7a8-4d9d-9816-42be04262414-kube-api-access-tdjsb\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790332 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.790305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-serving-certs-ca-bundle\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790434 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.790410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-telemeter-client-tls\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.790485 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.790440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-metrics-client-ca\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.791187 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.791064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-telemeter-trusted-ca-bundle\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.791483 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.791426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-metrics-client-ca\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.791483 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.791436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a8c586f-d7a8-4d9d-9816-42be04262414-serving-certs-ca-bundle\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.793212 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.793155 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-federate-client-tls\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.793485 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.793464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-secret-telemeter-client\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.793561 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.793514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.793648 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.793627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7a8c586f-d7a8-4d9d-9816-42be04262414-telemeter-client-tls\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.797579 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.797557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjsb\" (UniqueName: \"kubernetes.io/projected/7a8c586f-d7a8-4d9d-9816-42be04262414-kube-api-access-tdjsb\") pod \"telemeter-client-549565c777-rzlzc\" (UID: \"7a8c586f-d7a8-4d9d-9816-42be04262414\") " pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:33.894397 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:33.894355 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" Apr 23 01:12:34.562715 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.562679 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:12:34.570195 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.570162 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.573825 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.573797 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 01:12:34.574309 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574134 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 01:12:34.574309 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574219 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574427 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574703 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-p7fts\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574706 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574738 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574760 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574898 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7c14esouppiq2\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574900 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574910 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574947 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.574953 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 01:12:34.576691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.575078 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 01:12:34.579330 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.579273 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 01:12:34.591290 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.591234 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:12:34.698901 
ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.698867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699069 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.698924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-config\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699069 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699069 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699242 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699084 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699242 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699112 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699242 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699242 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699212 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699242 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699270 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699329 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skbl7\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-kube-api-access-skbl7\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699823 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699449 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699823 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699469 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.699823 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.699491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.800827 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.800790 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.801017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.800836 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.801017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.800863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skbl7\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-kube-api-access-skbl7\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.801017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.800890 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.801017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.800912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.801017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.800958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.801017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.800992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.801017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801473 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801585 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-config\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801753 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.801988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.802888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.805372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.805057 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.806592 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.806563 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.807732 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.806770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.807732 
ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.807250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.807732 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.807331 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.807732 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.807521 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.808039 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.807988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.809255 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.809209 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.809945 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.809893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.809945 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.809898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.810626 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.810490 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.810772 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.810751 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-config\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.811109 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.811086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skbl7\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-kube-api-access-skbl7\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.811310 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.811259 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.812185 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.812144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:34.885500 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:34.885473 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:38.764764 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:38.764290 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:12:38.770793 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:38.770572 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jjv8h\"" Apr 23 01:12:38.774499 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:38.774454 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:12:38.774747 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:38.774696 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bk66m" Apr 23 01:12:38.786850 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:38.786815 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-549565c777-rzlzc"] Apr 23 01:12:38.789984 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:38.789958 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8c586f_d7a8_4d9d_9816_42be04262414.slice/crio-6b864562190bae00f6343e0484bdebcd7295e238654fa574ffdad83b24af91d9 WatchSource:0}: Error finding container 6b864562190bae00f6343e0484bdebcd7295e238654fa574ffdad83b24af91d9: Status 404 returned error can't find the container with id 6b864562190bae00f6343e0484bdebcd7295e238654fa574ffdad83b24af91d9 Apr 23 01:12:38.803345 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:38.802584 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65b68c6657-v6kp5"] Apr 23 01:12:38.805724 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:38.805691 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1679ef2e_e9c0_4738_a19c_35582013fe18.slice/crio-a4d1c53f7c0bf2269139e8f1989b55ad23742772a8d95b97e2a477328966465c WatchSource:0}: Error finding container a4d1c53f7c0bf2269139e8f1989b55ad23742772a8d95b97e2a477328966465c: Status 404 returned error can't find the container with id a4d1c53f7c0bf2269139e8f1989b55ad23742772a8d95b97e2a477328966465c Apr 23 01:12:38.920303 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:38.920262 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bk66m"] Apr 23 01:12:38.931852 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:38.931802 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d4f338_f964_4fa6_985e_50bbb3b105a9.slice/crio-a60d2b3715163d2ff1f93e602aaa947e6e15469cf566b671c23f71772a363478 WatchSource:0}: Error finding container a60d2b3715163d2ff1f93e602aaa947e6e15469cf566b671c23f71772a363478: Status 404 returned error can't find the container with id a60d2b3715163d2ff1f93e602aaa947e6e15469cf566b671c23f71772a363478 Apr 23 01:12:39.280213 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.280146 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bk66m" event={"ID":"79d4f338-f964-4fa6-985e-50bbb3b105a9","Type":"ContainerStarted","Data":"a60d2b3715163d2ff1f93e602aaa947e6e15469cf566b671c23f71772a363478"} Apr 23 01:12:39.281863 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.281831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7ccd58b9-dsg9x" event={"ID":"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9","Type":"ContainerStarted","Data":"3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f"} Apr 23 01:12:39.283330 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.283287 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerStarted","Data":"97c7f0fbc0e10668b076120586f89f3f13640c9d28f48b6d73c6121a4095b1ae"} Apr 23 01:12:39.285129 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.285026 2569 generic.go:358] "Generic (PLEG): container finished" podID="4a3638c0-7a9c-4827-92b6-19af4e48804e" 
containerID="a2a438bff6bc70c7528cb6a23f6fffa4f4c35a7f96cb4a0a2d70416673d84792" exitCode=0 Apr 23 01:12:39.285129 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.285098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdwf9" event={"ID":"4a3638c0-7a9c-4827-92b6-19af4e48804e","Type":"ContainerDied","Data":"a2a438bff6bc70c7528cb6a23f6fffa4f4c35a7f96cb4a0a2d70416673d84792"} Apr 23 01:12:39.286487 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.286447 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" event={"ID":"7a8c586f-d7a8-4d9d-9816-42be04262414","Type":"ContainerStarted","Data":"6b864562190bae00f6343e0484bdebcd7295e238654fa574ffdad83b24af91d9"} Apr 23 01:12:39.288003 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.287951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-td9fq" event={"ID":"74ab69de-e0f4-4c2e-9254-d9fc69aed149","Type":"ContainerStarted","Data":"6a1ad2528c91a41f1f6d6466fb45a81e536c3ff9fc5b5046e6ccdffa934859ac"} Apr 23 01:12:39.288216 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.288197 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-td9fq" Apr 23 01:12:39.290445 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.289932 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" event={"ID":"1679ef2e-e9c0-4738-a19c-35582013fe18","Type":"ContainerStarted","Data":"40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89"} Apr 23 01:12:39.290445 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.289960 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" event={"ID":"1679ef2e-e9c0-4738-a19c-35582013fe18","Type":"ContainerStarted","Data":"a4d1c53f7c0bf2269139e8f1989b55ad23742772a8d95b97e2a477328966465c"} Apr 23 01:12:39.290445 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.290420 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:12:39.297107 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.297048 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c7ccd58b9-dsg9x" podStartSLOduration=1.356293049 podStartE2EDuration="12.29703189s" podCreationTimestamp="2026-04-23 01:12:27 +0000 UTC" firstStartedPulling="2026-04-23 01:12:27.635209328 +0000 UTC m=+155.420883663" lastFinishedPulling="2026-04-23 01:12:38.575948162 +0000 UTC m=+166.361622504" observedRunningTime="2026-04-23 01:12:39.296201853 +0000 UTC m=+167.081876209" watchObservedRunningTime="2026-04-23 01:12:39.29703189 +0000 UTC m=+167.082706248" Apr 23 01:12:39.301271 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.301252 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-td9fq" Apr 23 01:12:39.312192 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.310944 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-td9fq" podStartSLOduration=1.652232702 podStartE2EDuration="19.310928549s" podCreationTimestamp="2026-04-23 01:12:20 +0000 UTC" firstStartedPulling="2026-04-23 01:12:20.969669752 +0000 UTC m=+148.755344091" lastFinishedPulling="2026-04-23 01:12:38.628365603 
+0000 UTC m=+166.414039938" observedRunningTime="2026-04-23 01:12:39.309827153 +0000 UTC m=+167.095501936" watchObservedRunningTime="2026-04-23 01:12:39.310928549 +0000 UTC m=+167.096602906" Apr 23 01:12:39.348094 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:39.348045 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" podStartSLOduration=166.348024595 podStartE2EDuration="2m46.348024595s" podCreationTimestamp="2026-04-23 01:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:12:39.344005078 +0000 UTC m=+167.129679436" watchObservedRunningTime="2026-04-23 01:12:39.348024595 +0000 UTC m=+167.133698953" Apr 23 01:12:40.298773 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:40.298657 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdwf9" event={"ID":"4a3638c0-7a9c-4827-92b6-19af4e48804e","Type":"ContainerStarted","Data":"bc3fdeb51449f82315509f3717cd7e03d21d591b45d8a924fcf3abec503e04c1"} Apr 23 01:12:40.298773 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:40.298713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kdwf9" event={"ID":"4a3638c0-7a9c-4827-92b6-19af4e48804e","Type":"ContainerStarted","Data":"d779c3f6519c3fd1e6971e72757e5bcf7b6e41e30ecda51674f1973d36dec001"} Apr 23 01:12:40.317173 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:40.317117 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kdwf9" podStartSLOduration=3.065877372 podStartE2EDuration="12.317097201s" podCreationTimestamp="2026-04-23 01:12:28 +0000 UTC" firstStartedPulling="2026-04-23 01:12:29.32296155 +0000 UTC m=+157.108635902" lastFinishedPulling="2026-04-23 01:12:38.574181394 +0000 UTC m=+166.359855731" observedRunningTime="2026-04-23 01:12:40.315786762 +0000 UTC m=+168.101461145" watchObservedRunningTime="2026-04-23 01:12:40.317097201 +0000 UTC m=+168.102771559" Apr 23 01:12:41.306371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:41.304898 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fab49af-d632-4250-9079-d2294d443fe1" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" exitCode=0 Apr 23 01:12:41.306371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:41.306246 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c"} Apr 23 01:12:41.762684 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:41.762644 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rpd44" Apr 23 01:12:41.762860 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:41.762644 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v" Apr 23 01:12:41.765216 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:41.765189 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4q4b\"" Apr 23 01:12:41.773939 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:41.773916 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rpd44" Apr 23 01:12:42.441279 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:42.441174 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rpd44"] Apr 23 01:12:42.448529 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:12:42.448498 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod746bf4ef_4ba5_45d1_9cc6_ab6354c10b18.slice/crio-7cb123243eea0bb160681814dc254a3c2422461e2925bc669448cbae539648bb WatchSource:0}: Error finding container 7cb123243eea0bb160681814dc254a3c2422461e2925bc669448cbae539648bb: Status 404 returned error can't find the container with id 7cb123243eea0bb160681814dc254a3c2422461e2925bc669448cbae539648bb Apr 23 01:12:43.314391 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.314344 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpd44" event={"ID":"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18","Type":"ContainerStarted","Data":"7cb123243eea0bb160681814dc254a3c2422461e2925bc669448cbae539648bb"} Apr 23 01:12:43.318317 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.318279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" event={"ID":"7a8c586f-d7a8-4d9d-9816-42be04262414","Type":"ContainerStarted","Data":"57ff1fd0c6d82b1ea8fe27ec2c5b775934f20c403479596955a46f4580460228"} Apr 23 01:12:43.318459 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.318327 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" event={"ID":"7a8c586f-d7a8-4d9d-9816-42be04262414","Type":"ContainerStarted","Data":"63b4fb6fff776e573e3ee1c912c8229654b0efb5addd081affef6f364fb24031"} Apr 23 01:12:43.318459 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.318343 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" event={"ID":"7a8c586f-d7a8-4d9d-9816-42be04262414","Type":"ContainerStarted","Data":"b55c9fc7dd304d7bfec7d577ccc7533660271801d7874be63c6512d90351d98b"} Apr 23 01:12:43.320813 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.320753 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bk66m" event={"ID":"79d4f338-f964-4fa6-985e-50bbb3b105a9","Type":"ContainerStarted","Data":"da9889bda134195d092267700e2e5f68bb935427324105dc24e236006dc18a9c"} Apr 23 01:12:43.340453 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.338986 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-549565c777-rzlzc" podStartSLOduration=6.833885895 podStartE2EDuration="10.338968563s" podCreationTimestamp="2026-04-23 01:12:33 +0000 UTC" firstStartedPulling="2026-04-23 01:12:38.792185238 +0000 UTC m=+166.577859572" lastFinishedPulling="2026-04-23 01:12:42.297267898 +0000 UTC m=+170.082942240" observedRunningTime="2026-04-23 01:12:43.336773615 +0000 UTC m=+171.122447982" watchObservedRunningTime="2026-04-23 01:12:43.338968563 +0000 UTC m=+171.124642920" Apr 23 01:12:43.352392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.352342 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bk66m" podStartSLOduration=134.984794752 podStartE2EDuration="2m18.352326309s" podCreationTimestamp="2026-04-23 01:10:25 +0000 UTC" firstStartedPulling="2026-04-23 
01:12:38.934271868 +0000 UTC m=+166.719946204" lastFinishedPulling="2026-04-23 01:12:42.301803412 +0000 UTC m=+170.087477761" observedRunningTime="2026-04-23 01:12:43.350475132 +0000 UTC m=+171.136149491" watchObservedRunningTime="2026-04-23 01:12:43.352326309 +0000 UTC m=+171.138000667" Apr 23 01:12:43.439268 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:43.438530 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7ccd58b9-dsg9x"] Apr 23 01:12:45.330946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:45.330910 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpd44" event={"ID":"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18","Type":"ContainerStarted","Data":"baed01107d9152809d526467589e57388b8d7e6ffa2544ee7e1f3163666d7e01"} Apr 23 01:12:46.337453 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:46.337410 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rpd44" event={"ID":"746bf4ef-4ba5-45d1-9cc6-ab6354c10b18","Type":"ContainerStarted","Data":"b74aa6e9156ffa26ef8ac51903af45893d53bdd81b66cb3d7a532018c0af1206"} Apr 23 01:12:46.338384 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:46.338354 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rpd44" Apr 23 01:12:46.353723 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:46.353663 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rpd44" podStartSLOduration=138.782610829 podStartE2EDuration="2m21.353642966s" podCreationTimestamp="2026-04-23 01:10:25 +0000 UTC" firstStartedPulling="2026-04-23 01:12:42.451367316 +0000 UTC m=+170.237041657" lastFinishedPulling="2026-04-23 01:12:45.022399445 +0000 UTC m=+172.808073794" observedRunningTime="2026-04-23 01:12:46.353222901 +0000 UTC m=+174.138897267" watchObservedRunningTime="2026-04-23 01:12:46.353642966 +0000 UTC m=+174.139317323" Apr 23 01:12:47.344080 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:47.344045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerStarted","Data":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} Apr 23 01:12:47.344432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:47.344097 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerStarted","Data":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} Apr 23 01:12:47.503261 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:47.503221 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:12:51.359346 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:51.359312 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerStarted","Data":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} Apr 23 01:12:51.359346 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:51.359352 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerStarted","Data":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} Apr 23 01:12:51.359847 ip-10-0-138-235 
kubenswrapper[2569]: I0423 01:12:51.359361 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerStarted","Data":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} Apr 23 01:12:51.359847 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:51.359371 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerStarted","Data":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} Apr 23 01:12:51.385196 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:51.385152 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.653612728 podStartE2EDuration="17.385138727s" podCreationTimestamp="2026-04-23 01:12:34 +0000 UTC" firstStartedPulling="2026-04-23 01:12:38.781315883 +0000 UTC m=+166.566990221" lastFinishedPulling="2026-04-23 01:12:50.512841883 +0000 UTC m=+178.298516220" observedRunningTime="2026-04-23 01:12:51.383307899 +0000 UTC m=+179.168982256" watchObservedRunningTime="2026-04-23 01:12:51.385138727 +0000 UTC m=+179.170813084" Apr 23 01:12:53.354756 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:53.354717 2569 patch_prober.go:28] interesting pod/image-registry-65b68c6657-v6kp5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 01:12:53.355136 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:53.354784 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" podUID="1679ef2e-e9c0-4738-a19c-35582013fe18" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 01:12:54.886487 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:54.886444 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:12:57.350259 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:12:57.350229 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rpd44" Apr 23 01:13:01.311425 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:01.311397 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:13:06.412202 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:06.412172 2569 generic.go:358] "Generic (PLEG): container finished" podID="7df9359b-30f8-4806-98db-cf999c7f0ed8" containerID="b59396295ce0c96421f2e3a9d5ec6e9a623378d4e0ba0789648fa275c48df7d7" exitCode=0 Apr 23 01:13:06.412628 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:06.412249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sxqx7" event={"ID":"7df9359b-30f8-4806-98db-cf999c7f0ed8","Type":"ContainerDied","Data":"b59396295ce0c96421f2e3a9d5ec6e9a623378d4e0ba0789648fa275c48df7d7"} Apr 23 01:13:06.412628 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:06.412594 2569 scope.go:117] "RemoveContainer" containerID="b59396295ce0c96421f2e3a9d5ec6e9a623378d4e0ba0789648fa275c48df7d7" Apr 23 01:13:07.416964 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:07.416931 
2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sxqx7" event={"ID":"7df9359b-30f8-4806-98db-cf999c7f0ed8","Type":"ContainerStarted","Data":"3aea515ccfe1a76c612128d86217bae4c459dd2fc5afb013fa4305ae9a87a60b"} Apr 23 01:13:08.466862 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.466802 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c7ccd58b9-dsg9x" podUID="b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" containerName="console" containerID="cri-o://3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f" gracePeriod=15 Apr 23 01:13:08.700402 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.700380 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7ccd58b9-dsg9x_b06f04f0-7ac2-4abd-91c1-11d56af6cdd9/console/0.log" Apr 23 01:13:08.700513 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.700449 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:13:08.813951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.813876 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-oauth-config\") pod \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " Apr 23 01:13:08.813951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.813912 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-serving-cert\") pod \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " Apr 23 01:13:08.813951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.813945 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-config\") pod \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " Apr 23 01:13:08.814212 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.813964 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-service-ca\") pod \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " Apr 23 01:13:08.814212 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.814111 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dhnw\" (UniqueName: \"kubernetes.io/projected/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-kube-api-access-8dhnw\") pod \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " Apr 23 01:13:08.814212 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.814185 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-oauth-serving-cert\") pod \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\" (UID: \"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9\") " Apr 23 01:13:08.814431 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.814403 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-service-ca" (OuterVolumeSpecName: "service-ca") pod "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" (UID: "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:08.814514 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.814495 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-service-ca\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:08.814565 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.814550 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-config" (OuterVolumeSpecName: "console-config") pod "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" (UID: "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:08.814650 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.814557 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" (UID: "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:08.816289 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.816263 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" (UID: "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:08.816385 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.816287 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" (UID: "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:08.816385 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.816278 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-kube-api-access-8dhnw" (OuterVolumeSpecName: "kube-api-access-8dhnw") pod "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" (UID: "b06f04f0-7ac2-4abd-91c1-11d56af6cdd9"). InnerVolumeSpecName "kube-api-access-8dhnw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:08.915528 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.915488 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-oauth-config\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:08.915528 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.915523 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-serving-cert\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:08.915781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.915537 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-console-config\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:08.915781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.915552 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dhnw\" (UniqueName: \"kubernetes.io/projected/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-kube-api-access-8dhnw\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:08.915781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:08.915567 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9-oauth-serving-cert\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:09.423218 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.423190 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7ccd58b9-dsg9x_b06f04f0-7ac2-4abd-91c1-11d56af6cdd9/console/0.log" Apr 23 01:13:09.423420 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.423229 2569 generic.go:358] "Generic (PLEG): container finished" podID="b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" containerID="3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f" exitCode=2 Apr 23 01:13:09.423420 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.423267 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7ccd58b9-dsg9x" event={"ID":"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9","Type":"ContainerDied","Data":"3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f"} Apr 23 01:13:09.423420 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.423292 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7ccd58b9-dsg9x" Apr 23 01:13:09.423420 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.423318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7ccd58b9-dsg9x" event={"ID":"b06f04f0-7ac2-4abd-91c1-11d56af6cdd9","Type":"ContainerDied","Data":"d8137b01f7b8f2731d80a49dcb8b1c6be0102526e77f5507b117d0d135ca178b"} Apr 23 01:13:09.423420 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.423339 2569 scope.go:117] "RemoveContainer" containerID="3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f" Apr 23 01:13:09.431704 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.431686 2569 scope.go:117] "RemoveContainer" containerID="3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f" Apr 23 01:13:09.431973 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:09.431955 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f\": container with ID starting with 3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f not found: ID does not exist" containerID="3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f" Apr 23 01:13:09.432032 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.431979 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f"} err="failed to get container status \"3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f\": rpc error: code = NotFound desc = could not find container \"3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f\": container with ID starting with 3d298c1cc1d15e49e154359815bba6f57656e61534d1110da267043f24c56b5f not found: ID does not exist" Apr 23 01:13:09.442592 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.442567 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7ccd58b9-dsg9x"] Apr 23 01:13:09.445075 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:09.445052 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c7ccd58b9-dsg9x"] Apr 23 01:13:10.766130 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:10.766093 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" path="/var/lib/kubelet/pods/b06f04f0-7ac2-4abd-91c1-11d56af6cdd9/volumes" Apr 23 01:13:10.879401 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:10.879370 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65b68c6657-v6kp5"] Apr 23 01:13:26.481304 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:26.481264 2569 generic.go:358] "Generic (PLEG): container finished" podID="2a988bd2-2ed4-468c-ab78-3be082ee6a61" containerID="3c2db433206bef6a3de94d8e2af1cf55cd4ec521c7cc6393f5e1a024cccf0c97" exitCode=0 Apr 23 01:13:26.481844 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:26.481334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48" event={"ID":"2a988bd2-2ed4-468c-ab78-3be082ee6a61","Type":"ContainerDied","Data":"3c2db433206bef6a3de94d8e2af1cf55cd4ec521c7cc6393f5e1a024cccf0c97"} Apr 23 01:13:26.481844 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:26.481694 2569 scope.go:117] "RemoveContainer" 
containerID="3c2db433206bef6a3de94d8e2af1cf55cd4ec521c7cc6393f5e1a024cccf0c97" Apr 23 01:13:27.486559 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:27.486528 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-krk48" event={"ID":"2a988bd2-2ed4-468c-ab78-3be082ee6a61","Type":"ContainerStarted","Data":"33948131d95da2984736c658ada6d2ced873421bd57460f59f208a6643d173c8"} Apr 23 01:13:34.886139 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:34.886105 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:34.905214 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:34.905186 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:35.524216 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:35.524191 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:35.903053 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:35.903011 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" podUID="1679ef2e-e9c0-4738-a19c-35582013fe18" containerName="registry" containerID="cri-o://40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89" gracePeriod=30 Apr 23 01:13:37.159251 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.159230 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:13:37.260912 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.260818 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1679ef2e-e9c0-4738-a19c-35582013fe18-ca-trust-extracted\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.260912 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.260870 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-trusted-ca\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.260912 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.260903 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.261191 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.260922 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-certificates\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.261191 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.260944 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-bound-sa-token\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: 
\"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.261191 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.260977 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-installation-pull-secrets\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.261191 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.261030 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-image-registry-private-configuration\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.261191 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.261052 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8nns\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-kube-api-access-v8nns\") pod \"1679ef2e-e9c0-4738-a19c-35582013fe18\" (UID: \"1679ef2e-e9c0-4738-a19c-35582013fe18\") " Apr 23 01:13:37.261441 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.261397 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:37.261441 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.261408 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:37.263778 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.263744 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:37.263778 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.263765 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:37.263961 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.263751 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:37.263961 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.263893 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-kube-api-access-v8nns" (OuterVolumeSpecName: "kube-api-access-v8nns") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "kube-api-access-v8nns". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:37.263961 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.263924 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:37.269826 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.269803 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1679ef2e-e9c0-4738-a19c-35582013fe18-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1679ef2e-e9c0-4738-a19c-35582013fe18" (UID: "1679ef2e-e9c0-4738-a19c-35582013fe18"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:13:37.362392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362332 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-tls\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.362392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362386 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-registry-certificates\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.362392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362401 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-bound-sa-token\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.362711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362414 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-installation-pull-secrets\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.362711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362427 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1679ef2e-e9c0-4738-a19c-35582013fe18-image-registry-private-configuration\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.362711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362441 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8nns\" (UniqueName: \"kubernetes.io/projected/1679ef2e-e9c0-4738-a19c-35582013fe18-kube-api-access-v8nns\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.362711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362452 2569 
reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1679ef2e-e9c0-4738-a19c-35582013fe18-ca-trust-extracted\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.362711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.362463 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1679ef2e-e9c0-4738-a19c-35582013fe18-trusted-ca\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:37.515403 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.515306 2569 generic.go:358] "Generic (PLEG): container finished" podID="1679ef2e-e9c0-4738-a19c-35582013fe18" containerID="40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89" exitCode=0 Apr 23 01:13:37.515403 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.515373 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" event={"ID":"1679ef2e-e9c0-4738-a19c-35582013fe18","Type":"ContainerDied","Data":"40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89"} Apr 23 01:13:37.515403 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.515399 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" event={"ID":"1679ef2e-e9c0-4738-a19c-35582013fe18","Type":"ContainerDied","Data":"a4d1c53f7c0bf2269139e8f1989b55ad23742772a8d95b97e2a477328966465c"} Apr 23 01:13:37.515403 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.515398 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65b68c6657-v6kp5" Apr 23 01:13:37.515757 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.515415 2569 scope.go:117] "RemoveContainer" containerID="40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89" Apr 23 01:13:37.523785 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.523766 2569 scope.go:117] "RemoveContainer" containerID="40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89" Apr 23 01:13:37.524045 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:37.524025 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89\": container with ID starting with 40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89 not found: ID does not exist" containerID="40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89" Apr 23 01:13:37.524103 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.524054 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89"} err="failed to get container status \"40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89\": rpc error: code = NotFound desc = could not find container \"40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89\": container with ID starting with 40d64d94156bd53d43140d2c3b8debde87017900b1b4674ef59017588170db89 not found: ID does not exist" Apr 23 01:13:37.534492 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.534466 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65b68c6657-v6kp5"] Apr 23 01:13:37.537721 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:37.537695 2569 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-image-registry/image-registry-65b68c6657-v6kp5"] Apr 23 01:13:38.766307 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:38.766273 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1679ef2e-e9c0-4738-a19c-35582013fe18" path="/var/lib/kubelet/pods/1679ef2e-e9c0-4738-a19c-35582013fe18/volumes" Apr 23 01:13:52.863761 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:52.863723 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:13:52.864633 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:52.864373 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy" containerID="cri-o://5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" gracePeriod=600 Apr 23 01:13:52.864633 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:52.864373 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="prometheus" containerID="cri-o://84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" gracePeriod=600 Apr 23 01:13:52.864633 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:52.864515 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-web" containerID="cri-o://7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" gracePeriod=600 Apr 23 01:13:52.864633 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:52.864568 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="thanos-sidecar" containerID="cri-o://5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" gracePeriod=600 Apr 23 01:13:52.864633 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:52.864589 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="config-reloader" containerID="cri-o://66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" gracePeriod=600 Apr 23 01:13:52.864936 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:52.864602 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-thanos" containerID="cri-o://f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" gracePeriod=600 Apr 23 01:13:53.116090 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.116066 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.188768 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188736 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-rulefiles-0\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.188960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188777 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-serving-certs-ca-bundle\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.188960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188808 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.188960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188839 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-db\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.188960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188865 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-config-out\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.188960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188896 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.188960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188945 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skbl7\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-kube-api-access-skbl7\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.188972 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-tls-assets\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189003 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-metrics-client-ca\") pod 
\"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189026 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-web-config\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189050 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-kubelet-serving-ca-bundle\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189079 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-kube-rbac-proxy\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189109 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-config\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189135 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-grpc-tls\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189178 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-trusted-ca-bundle\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189203 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-metrics-client-certs\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189230 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-tls\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189233 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: 
"6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:53.189809 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189301 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-thanos-prometheus-http-client-file\") pod \"6fab49af-d632-4250-9079-d2294d443fe1\" (UID: \"6fab49af-d632-4250-9079-d2294d443fe1\") " Apr 23 01:13:53.189809 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189531 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.191533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.189895 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:53.191533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.190239 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:53.191533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.190352 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:53.191533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.190595 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 01:13:53.191533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.191267 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:13:53.193989 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.193949 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.193989 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.193967 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.194179 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.193972 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:53.194348 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.194328 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-config-out" (OuterVolumeSpecName: "config-out") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:13:53.194489 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.194460 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.194555 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.194478 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.194665 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.194638 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.194792 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.194771 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-config" (OuterVolumeSpecName: "config") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.194899 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.194825 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.195331 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.195303 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-kube-api-access-skbl7" (OuterVolumeSpecName: "kube-api-access-skbl7") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "kube-api-access-skbl7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:13:53.195456 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.195441 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.203864 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.203840 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-web-config" (OuterVolumeSpecName: "web-config") pod "6fab49af-d632-4250-9079-d2294d443fe1" (UID: "6fab49af-d632-4250-9079-d2294d443fe1"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 01:13:53.290589 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290553 2569 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290589 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290587 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290603 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290639 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-k8s-db\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290653 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fab49af-d632-4250-9079-d2294d443fe1-config-out\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290667 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290680 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skbl7\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-kube-api-access-skbl7\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290692 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fab49af-d632-4250-9079-d2294d443fe1-tls-assets\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290704 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-metrics-client-ca\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290716 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-web-config\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290729 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290741 2569 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-kube-rbac-proxy\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290755 2569 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-config\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290767 2569 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-grpc-tls\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290780 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fab49af-d632-4250-9079-d2294d443fe1-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290793 2569 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-metrics-client-certs\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.290841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.290806 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fab49af-d632-4250-9079-d2294d443fe1-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:13:53.569568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569480 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fab49af-d632-4250-9079-d2294d443fe1" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" exitCode=0 Apr 23 01:13:53.569568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569507 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fab49af-d632-4250-9079-d2294d443fe1" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" exitCode=0 Apr 23 01:13:53.569568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569513 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fab49af-d632-4250-9079-d2294d443fe1" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" exitCode=0 Apr 23 01:13:53.569568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569519 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fab49af-d632-4250-9079-d2294d443fe1" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" exitCode=0 Apr 23 01:13:53.569568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569525 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fab49af-d632-4250-9079-d2294d443fe1" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" exitCode=0 Apr 23 01:13:53.569568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569532 
2569 generic.go:358] "Generic (PLEG): container finished" podID="6fab49af-d632-4250-9079-d2294d443fe1" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" exitCode=0 Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569578 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569571 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569650 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569662 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569672 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569680 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569689 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569697 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fab49af-d632-4250-9079-d2294d443fe1","Type":"ContainerDied","Data":"97c7f0fbc0e10668b076120586f89f3f13640c9d28f48b6d73c6121a4095b1ae"} Apr 23 01:13:53.569929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.569722 2569 scope.go:117] "RemoveContainer" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 01:13:53.581770 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.581555 2569 scope.go:117] "RemoveContainer" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.588887 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.588866 2569 scope.go:117] "RemoveContainer" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.594255 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.594234 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:13:53.595695 ip-10-0-138-235 kubenswrapper[2569]: I0423 
01:13:53.595680 2569 scope.go:117] "RemoveContainer" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.598431 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.598411 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:13:53.602260 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.602243 2569 scope.go:117] "RemoveContainer" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.608762 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.608736 2569 scope.go:117] "RemoveContainer" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.615680 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.615661 2569 scope.go:117] "RemoveContainer" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.620020 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.619999 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:13:53.620399 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620379 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-web" Apr 23 01:13:53.620467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620404 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-web" Apr 23 01:13:53.620467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620417 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="thanos-sidecar" Apr 23 01:13:53.620467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620425 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="thanos-sidecar" Apr 23 01:13:53.620467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620434 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-thanos" Apr 23 01:13:53.620467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620443 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-thanos" Apr 23 01:13:53.620467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620453 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="config-reloader" Apr 23 01:13:53.620467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620461 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="config-reloader" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620471 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1679ef2e-e9c0-4738-a19c-35582013fe18" containerName="registry" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620479 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1679ef2e-e9c0-4738-a19c-35582013fe18" containerName="registry" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620492 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fab49af-d632-4250-9079-d2294d443fe1" 
containerName="kube-rbac-proxy" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620499 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620510 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" containerName="console" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620517 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" containerName="console" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620531 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="prometheus" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620539 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="prometheus" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620551 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="init-config-reloader" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620559 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="init-config-reloader" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620643 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620655 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="prometheus" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620663 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="thanos-sidecar" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620675 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="config-reloader" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620683 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-web" Apr 23 01:13:53.620697 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620694 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b06f04f0-7ac2-4abd-91c1-11d56af6cdd9" containerName="console" Apr 23 01:13:53.621158 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620704 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fab49af-d632-4250-9079-d2294d443fe1" containerName="kube-rbac-proxy-thanos" Apr 23 01:13:53.621158 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.620715 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1679ef2e-e9c0-4738-a19c-35582013fe18" containerName="registry" Apr 23 01:13:53.623381 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.623322 2569 scope.go:117] "RemoveContainer" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 
01:13:53.623678 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:53.623657 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": container with ID starting with f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb not found: ID does not exist" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 01:13:53.623743 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.623687 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} err="failed to get container status \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": rpc error: code = NotFound desc = could not find container \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": container with ID starting with f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb not found: ID does not exist" Apr 23 01:13:53.623743 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.623706 2569 scope.go:117] "RemoveContainer" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.623960 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:53.623936 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": container with ID starting with 5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a not found: ID does not exist" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.624010 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.623966 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} err="failed to get container status \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": rpc error: code = NotFound desc = could not find container \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": container with ID starting with 5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a not found: ID does not exist" Apr 23 01:13:53.624010 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.623982 2569 scope.go:117] "RemoveContainer" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.624206 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:53.624191 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": container with ID starting with 7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb not found: ID does not exist" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.624246 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624209 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} err="failed to get container status \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": rpc error: code = NotFound desc = could not find container \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": 
container with ID starting with 7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb not found: ID does not exist" Apr 23 01:13:53.624246 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624220 2569 scope.go:117] "RemoveContainer" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.624427 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:53.624412 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": container with ID starting with 5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693 not found: ID does not exist" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.624468 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624430 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} err="failed to get container status \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": rpc error: code = NotFound desc = could not find container \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": container with ID starting with 5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693 not found: ID does not exist" Apr 23 01:13:53.624468 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624442 2569 scope.go:117] "RemoveContainer" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.624712 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:53.624687 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": container with ID starting with 66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85 not found: ID does not exist" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.624763 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624717 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} err="failed to get container status \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": rpc error: code = NotFound desc = could not find container \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": container with ID starting with 66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85 not found: ID does not exist" Apr 23 01:13:53.624763 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624733 2569 scope.go:117] "RemoveContainer" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.624929 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:53.624915 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": container with ID starting with 84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983 not found: ID does not exist" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.624971 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624932 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} err="failed to get container status \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": rpc error: code = NotFound desc = could not find container \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": container with ID starting with 84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983 not found: ID does not exist" Apr 23 01:13:53.624971 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.624944 2569 scope.go:117] "RemoveContainer" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.625157 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:13:53.625140 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": container with ID starting with ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c not found: ID does not exist" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.625215 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625166 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c"} err="failed to get container status \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": rpc error: code = NotFound desc = could not find container \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": container with ID starting with ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c not found: ID does not exist" Apr 23 01:13:53.625215 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625188 2569 scope.go:117] "RemoveContainer" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 01:13:53.625409 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625392 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} err="failed to get container status \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": rpc error: code = NotFound desc = could not find container \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": container with ID starting with f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb not found: ID does not exist" Apr 23 01:13:53.625454 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625411 2569 scope.go:117] "RemoveContainer" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.625640 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625603 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} err="failed to get container status \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": rpc error: code = NotFound desc = could not find container \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": container with ID starting with 5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a not found: ID does not exist" Apr 23 01:13:53.625686 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625641 2569 scope.go:117] "RemoveContainer" 
containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.625875 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625857 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} err="failed to get container status \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": rpc error: code = NotFound desc = could not find container \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": container with ID starting with 7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb not found: ID does not exist" Apr 23 01:13:53.625951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.625877 2569 scope.go:117] "RemoveContainer" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.626152 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.626128 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} err="failed to get container status \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": rpc error: code = NotFound desc = could not find container \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": container with ID starting with 5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693 not found: ID does not exist" Apr 23 01:13:53.626152 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.626151 2569 scope.go:117] "RemoveContainer" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.626423 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.626399 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} err="failed to get container status \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": rpc error: code = NotFound desc = could not find container \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": container with ID starting with 66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85 not found: ID does not exist" Apr 23 01:13:53.626497 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.626424 2569 scope.go:117] "RemoveContainer" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.626497 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.626469 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.626664 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.626643 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} err="failed to get container status \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": rpc error: code = NotFound desc = could not find container \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": container with ID starting with 84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983 not found: ID does not exist" Apr 23 01:13:53.626742 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.626665 2569 scope.go:117] "RemoveContainer" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.627062 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627040 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c"} err="failed to get container status \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": rpc error: code = NotFound desc = could not find container \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": container with ID starting with ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c not found: ID does not exist" Apr 23 01:13:53.627062 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627061 2569 scope.go:117] "RemoveContainer" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 01:13:53.627326 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627303 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} err="failed to get container status \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": rpc error: code = NotFound desc = could not find container \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": container with ID starting with f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb not found: ID does not exist" Apr 23 01:13:53.627326 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627326 2569 scope.go:117] "RemoveContainer" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.627603 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627577 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} err="failed to get container status \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": rpc error: code = NotFound desc = could not find container \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": container with ID starting with 5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a not found: ID does not exist" Apr 23 01:13:53.627603 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627603 2569 scope.go:117] "RemoveContainer" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.627941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627912 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} err="failed to get container status \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": rpc error: code = NotFound desc = could not find container \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": container with ID starting with 7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb not found: ID does not exist" Apr 23 01:13:53.628079 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.627949 2569 scope.go:117] "RemoveContainer" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.628252 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628210 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} err="failed to get container status \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": rpc error: code = NotFound desc = could not find container \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": container with ID starting with 5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693 not found: ID does not exist" Apr 23 01:13:53.628252 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628237 2569 scope.go:117] "RemoveContainer" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.628534 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628483 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} err="failed to get container status \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": rpc error: code = NotFound desc = could not find container \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": container with ID starting with 66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85 not found: ID does not exist" Apr 23 01:13:53.628534 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628516 2569 scope.go:117] "RemoveContainer" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.628736 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628670 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 01:13:53.628736 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628640 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-p7fts\"" Apr 23 01:13:53.628873 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628830 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} err="failed to get container status \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": rpc error: code = NotFound desc = could not find container \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": container with ID starting with 84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983 not found: ID does not exist" Apr 23 01:13:53.628873 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.628863 2569 scope.go:117] "RemoveContainer" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.629079 
ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629060 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 01:13:53.629178 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629111 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 01:13:53.629178 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629107 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c"} err="failed to get container status \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": rpc error: code = NotFound desc = could not find container \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": container with ID starting with ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c not found: ID does not exist" Apr 23 01:13:53.629178 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629135 2569 scope.go:117] "RemoveContainer" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 01:13:53.629178 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629138 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 01:13:53.629178 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629154 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 01:13:53.629178 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629173 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 01:13:53.629740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629064 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 01:13:53.629740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629062 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 01:13:53.629740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629305 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 01:13:53.629740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629409 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} err="failed to get container status \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": rpc error: code = NotFound desc = could not find container \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": container with ID starting with f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb not found: ID does not exist" Apr 23 01:13:53.629740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629428 2569 scope.go:117] "RemoveContainer" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.629740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629683 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} err="failed to get container status \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": rpc error: code = NotFound desc = could not find container \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": container with ID starting with 5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a not found: ID does not exist" Apr 23 01:13:53.629740 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629702 2569 scope.go:117] "RemoveContainer" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.630064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629892 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} err="failed to get container status \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": rpc error: code = NotFound desc = could not find container \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": container with ID starting with 7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb not found: ID does not exist" Apr 23 01:13:53.630064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629913 2569 scope.go:117] "RemoveContainer" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.630064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629946 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 01:13:53.630064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629977 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 01:13:53.630064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.629986 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7c14esouppiq2\"" Apr 23 01:13:53.630347 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.630323 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} err="failed to get container status \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": rpc error: code = NotFound desc = could not find container \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": container with ID starting with 5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693 not found: ID does not exist" Apr 23 01:13:53.630412 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.630350 2569 scope.go:117] "RemoveContainer" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.630682 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.630587 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} err="failed to get container status \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": rpc error: code = NotFound desc = could not find container \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": container with ID starting with 66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85 not found: ID 
does not exist" Apr 23 01:13:53.630682 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.630646 2569 scope.go:117] "RemoveContainer" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.630931 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.630904 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} err="failed to get container status \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": rpc error: code = NotFound desc = could not find container \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": container with ID starting with 84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983 not found: ID does not exist" Apr 23 01:13:53.630998 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.630933 2569 scope.go:117] "RemoveContainer" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.631206 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.631181 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c"} err="failed to get container status \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": rpc error: code = NotFound desc = could not find container \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": container with ID starting with ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c not found: ID does not exist" Apr 23 01:13:53.631280 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.631208 2569 scope.go:117] "RemoveContainer" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 01:13:53.631491 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.631466 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} err="failed to get container status \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": rpc error: code = NotFound desc = could not find container \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": container with ID starting with f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb not found: ID does not exist" Apr 23 01:13:53.631569 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.631492 2569 scope.go:117] "RemoveContainer" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.631688 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.631673 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 01:13:53.631809 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.631790 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} err="failed to get container status \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": rpc error: code = NotFound desc = could not find container \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": container with ID starting with 5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a not found: ID does not exist" Apr 23 01:13:53.631873 ip-10-0-138-235 kubenswrapper[2569]: I0423 
01:13:53.631812 2569 scope.go:117] "RemoveContainer" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.632121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.632083 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} err="failed to get container status \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": rpc error: code = NotFound desc = could not find container \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": container with ID starting with 7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb not found: ID does not exist" Apr 23 01:13:53.632121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.632119 2569 scope.go:117] "RemoveContainer" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.632462 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.632417 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} err="failed to get container status \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": rpc error: code = NotFound desc = could not find container \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": container with ID starting with 5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693 not found: ID does not exist" Apr 23 01:13:53.632462 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.632443 2569 scope.go:117] "RemoveContainer" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.632920 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.632796 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} err="failed to get container status \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": rpc error: code = NotFound desc = could not find container \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": container with ID starting with 66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85 not found: ID does not exist" Apr 23 01:13:53.632920 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.632843 2569 scope.go:117] "RemoveContainer" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.633419 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.633391 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} err="failed to get container status \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": rpc error: code = NotFound desc = could not find container \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": container with ID starting with 84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983 not found: ID does not exist" Apr 23 01:13:53.633419 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.633414 2569 scope.go:117] "RemoveContainer" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.633813 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.633755 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c"} err="failed to get container status \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": rpc error: code = NotFound desc = could not find container \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": container with ID starting with ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c not found: ID does not exist" Apr 23 01:13:53.633813 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.633783 2569 scope.go:117] "RemoveContainer" containerID="f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb" Apr 23 01:13:53.634081 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.634020 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb"} err="failed to get container status \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": rpc error: code = NotFound desc = could not find container \"f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb\": container with ID starting with f2f0a72b5dab96814dd0f827628a43a1a699793fada5a37d2bfcf03693318bfb not found: ID does not exist" Apr 23 01:13:53.634081 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.634043 2569 scope.go:117] "RemoveContainer" containerID="5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a" Apr 23 01:13:53.636392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.635488 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 01:13:53.637969 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.636858 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a"} err="failed to get container status \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": rpc error: code = NotFound desc = could not find container \"5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a\": container with ID starting with 5fab9edf1038bdd9d4814d92e7d988fa6e0b72fab22fcfe413bb85f793f9172a not found: ID does not exist" Apr 23 01:13:53.637969 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.636884 2569 scope.go:117] "RemoveContainer" containerID="7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb" Apr 23 01:13:53.637969 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.637601 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.638283 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb"} err="failed to get container status \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": rpc error: code = NotFound desc = could not find container \"7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb\": container with ID starting with 7561693df22bd99cd48f89c818cbd29d4ae10f386fdd70c5558e496f09182fdb not found: ID does not exist" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.638307 2569 scope.go:117] "RemoveContainer" containerID="5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: 
I0423 01:13:53.638521 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693"} err="failed to get container status \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": rpc error: code = NotFound desc = could not find container \"5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693\": container with ID starting with 5e3d7e2f3400f64ccda38089126fa78d7abb994637085a6782524657b7b7e693 not found: ID does not exist" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.638541 2569 scope.go:117] "RemoveContainer" containerID="66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.638748 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85"} err="failed to get container status \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": rpc error: code = NotFound desc = could not find container \"66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85\": container with ID starting with 66a71d7fd2dc51780da61928c323508744bc5abefcb958e5cd2746c58a8d3a85 not found: ID does not exist" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.638766 2569 scope.go:117] "RemoveContainer" containerID="84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.638940 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983"} err="failed to get container status \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": rpc error: code = NotFound desc = could not find container \"84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983\": container with ID starting with 84c35dfc6666afe0f8d88fa99a5ba72fe961770783713d768166a360bd4ea983 not found: ID does not exist" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.638960 2569 scope.go:117] "RemoveContainer" containerID="ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c" Apr 23 01:13:53.639281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.639227 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c"} err="failed to get container status \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": rpc error: code = NotFound desc = could not find container \"ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c\": container with ID starting with ecc69aa537b7a8aa8317d8e0df4828958167acd7d331047a09e6625b1a51340c not found: ID does not exist" Apr 23 01:13:53.693627 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693577 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35985415-770a-4cf2-a83a-2a1bbef2d634-config-out\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bqcmg\" (UniqueName: \"kubernetes.io/projected/35985415-770a-4cf2-a83a-2a1bbef2d634-kube-api-access-bqcmg\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693690 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693720 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35985415-770a-4cf2-a83a-2a1bbef2d634-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693747 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693781 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693932 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693784 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693932 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693932 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693834 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693932 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693932 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-config\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693932 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693889 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-web-config\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.693932 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693922 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.694121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693938 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.694121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.694121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693971 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.694121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.693987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.694121 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.694002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.794871 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.794825 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-config\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.794871 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.794874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-web-config\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.794896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.794914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.794933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.794951 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.794977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795001 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35985415-770a-4cf2-a83a-2a1bbef2d634-config-out\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqcmg\" (UniqueName: \"kubernetes.io/projected/35985415-770a-4cf2-a83a-2a1bbef2d634-kube-api-access-bqcmg\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795144 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35985415-770a-4cf2-a83a-2a1bbef2d634-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795207 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795234 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795716 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 01:13:53.795716 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 23 01:13:53.795716 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.795716 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795422 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.795908 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.795745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.796550 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.796204 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.796550 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.796359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.797763 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.797356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.798533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.797980 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-config\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.798533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.798074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-web-config\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.798533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.798133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.798533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.798194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35985415-770a-4cf2-a83a-2a1bbef2d634-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.798533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.798286 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.798533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.798288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.799824 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.799805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35985415-770a-4cf2-a83a-2a1bbef2d634-config-out\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.800153 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.800134 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.800316 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.800297 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/35985415-770a-4cf2-a83a-2a1bbef2d634-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.800518 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.800500 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.800654 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.800632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.800708 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.800695 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/35985415-770a-4cf2-a83a-2a1bbef2d634-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.804587 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.804569 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqcmg\" (UniqueName: \"kubernetes.io/projected/35985415-770a-4cf2-a83a-2a1bbef2d634-kube-api-access-bqcmg\") pod \"prometheus-k8s-0\" (UID: \"35985415-770a-4cf2-a83a-2a1bbef2d634\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:53.964205 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:53.964167 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:13:54.088067 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:54.088029 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 01:13:54.092049 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:13:54.092012 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35985415_770a_4cf2_a83a_2a1bbef2d634.slice/crio-e40a4f3078420cd91d223b19c4ebb688c4a84d9e890f46633c25bbad7f6d6d27 WatchSource:0}: Error finding container e40a4f3078420cd91d223b19c4ebb688c4a84d9e890f46633c25bbad7f6d6d27: Status 404 returned error can't find the container with id e40a4f3078420cd91d223b19c4ebb688c4a84d9e890f46633c25bbad7f6d6d27
Apr 23 01:13:54.574396 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:54.574318 2569 generic.go:358] "Generic (PLEG): container finished" podID="35985415-770a-4cf2-a83a-2a1bbef2d634" containerID="7b2d848d9c7fedfa7165c6f528dcf8a8d4f05b2a2b3830e2410bfa52a2703f55" exitCode=0
Apr 23 01:13:54.574396 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:54.574364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerDied","Data":"7b2d848d9c7fedfa7165c6f528dcf8a8d4f05b2a2b3830e2410bfa52a2703f55"}
Apr 23 01:13:54.574586 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:54.574400 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerStarted","Data":"e40a4f3078420cd91d223b19c4ebb688c4a84d9e890f46633c25bbad7f6d6d27"}
Apr 23 01:13:54.770790 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:54.768834 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fab49af-d632-4250-9079-d2294d443fe1" path="/var/lib/kubelet/pods/6fab49af-d632-4250-9079-d2294d443fe1/volumes"
Apr 23 01:13:55.584385 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:55.584350 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerStarted","Data":"5e306ab0d2d110cf7ad3109dd960a5dc1e430f3498051ac80b702c86ca1fdd82"}
Apr 23 01:13:55.584385 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:55.584385 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerStarted","Data":"d8e856b6a59edd696f0e992b684a95a2b14ef0dbbbf4fd55c50db8521acaccf5"}
Apr 23 01:13:55.584798 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:55.584396 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerStarted","Data":"68ec8976d45703bc4fe2ad6ba0abafcc0f8e080f196072871598e407aa396f8b"}
Apr 23 01:13:55.584798 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:55.584404 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerStarted","Data":"0288c18de2a821b643cd8eb070bf8e0e97eb525ef8dbe17f78022ecc210a7b90"}
Apr 23 01:13:55.584798 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:55.584412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerStarted","Data":"5cb6009ea9e008089a0a06e50c4cf7f550585e83240f68239eb2a2133f37c270"}
Apr 23 01:13:55.584798 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:55.584420 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"35985415-770a-4cf2-a83a-2a1bbef2d634","Type":"ContainerStarted","Data":"e0013ee5d25af7d254beb2630e96dfbfae221451fbd1322d31cceae7f733f7eb"}
Apr 23 01:13:55.610911 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:55.610783 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.610766779 podStartE2EDuration="2.610766779s" podCreationTimestamp="2026-04-23 01:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:13:55.609311384 +0000 UTC m=+243.394985742" watchObservedRunningTime="2026-04-23 01:13:55.610766779 +0000 UTC m=+243.396441141"
Apr 23 01:13:58.965134 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:13:58.965102 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:14:04.690939 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:04.690895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:14:04.693162 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:04.693127 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/608e8d52-e2cd-48e3-b524-0f0d764d9501-metrics-certs\") pod \"network-metrics-daemon-5mm4v\" (UID: \"608e8d52-e2cd-48e3-b524-0f0d764d9501\") " pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:14:04.866123 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:04.866090 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj44d\""
Apr 23 01:14:04.874927 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:04.874905 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5mm4v"
Apr 23 01:14:04.995483 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:04.993491 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5mm4v"]
Apr 23 01:14:04.996974 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:14:04.996940 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod608e8d52_e2cd_48e3_b524_0f0d764d9501.slice/crio-7dff3c706f759cb65a2c77403021d30cc353cecc994cdc28bcfcb663f41ebdb1 WatchSource:0}: Error finding container 7dff3c706f759cb65a2c77403021d30cc353cecc994cdc28bcfcb663f41ebdb1: Status 404 returned error can't find the container with id 7dff3c706f759cb65a2c77403021d30cc353cecc994cdc28bcfcb663f41ebdb1
Apr 23 01:14:05.614749 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:05.614707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5mm4v" event={"ID":"608e8d52-e2cd-48e3-b524-0f0d764d9501","Type":"ContainerStarted","Data":"7dff3c706f759cb65a2c77403021d30cc353cecc994cdc28bcfcb663f41ebdb1"}
Apr 23 01:14:06.619279 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:06.619247 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5mm4v" event={"ID":"608e8d52-e2cd-48e3-b524-0f0d764d9501","Type":"ContainerStarted","Data":"9039fa7edd5445c7a73fa054e873443f611ebc98f0a6f76a362e4d0e35fa6377"}
Apr 23 01:14:06.619279 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:06.619280 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5mm4v" event={"ID":"608e8d52-e2cd-48e3-b524-0f0d764d9501","Type":"ContainerStarted","Data":"05206d6a3eaf79de0a491d267074e73562c2236b219f30574c9ff22320841458"}
Apr 23 01:14:06.634698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:06.634648 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5mm4v" podStartSLOduration=253.696186035 podStartE2EDuration="4m14.634636161s" podCreationTimestamp="2026-04-23 01:09:52 +0000 UTC" firstStartedPulling="2026-04-23 01:14:04.99877557 +0000 UTC m=+252.784449906" lastFinishedPulling="2026-04-23 01:14:05.937225697 +0000 UTC m=+253.722900032" observedRunningTime="2026-04-23 01:14:06.632695126 +0000 UTC m=+254.418369498" watchObservedRunningTime="2026-04-23 01:14:06.634636161 +0000 UTC m=+254.420310518"
Apr 23 01:14:52.627438 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:52.627410 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:14:52.628110 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:52.628091 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:14:52.632767 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:52.632737 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:14:52.633751 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:52.633727 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:14:52.646294 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:52.646273 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 01:14:53.964433 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:53.964398 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:14:53.979764 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:53.979738 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:14:54.772660 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:14:54.772626 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 01:15:34.639299 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.639265 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"]
Apr 23 01:15:34.642557 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.642541 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:34.644633 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.644590 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 23 01:15:34.645383 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.645369 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 23 01:15:34.645441 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.645369 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-ww9m7\""
Apr 23 01:15:34.651727 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.651703 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"]
Apr 23 01:15:34.749870 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.749840 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2-tmp\") pod \"openshift-lws-operator-bfc7f696d-7mtq5\" (UID: \"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:34.750026 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.749889 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv6s9\" (UniqueName: \"kubernetes.io/projected/45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2-kube-api-access-jv6s9\") pod \"openshift-lws-operator-bfc7f696d-7mtq5\" (UID: \"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:34.850780 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.850741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2-tmp\") pod \"openshift-lws-operator-bfc7f696d-7mtq5\" (UID: \"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:34.850960 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.850800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv6s9\" (UniqueName: \"kubernetes.io/projected/45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2-kube-api-access-jv6s9\") pod \"openshift-lws-operator-bfc7f696d-7mtq5\" (UID: \"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:34.851160 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.851138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2-tmp\") pod \"openshift-lws-operator-bfc7f696d-7mtq5\" (UID: \"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:34.858224 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.858194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv6s9\" (UniqueName: \"kubernetes.io/projected/45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2-kube-api-access-jv6s9\") pod \"openshift-lws-operator-bfc7f696d-7mtq5\" (UID: \"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:34.965271 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:34.965179 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"
Apr 23 01:15:35.083064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:35.083028 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5"]
Apr 23 01:15:35.086076 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:15:35.086047 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45831f9c_10ca_4d6e_a7fe_1cff13e1e7a2.slice/crio-7cd9458024453bf05d47669233ce1e6553d28d7c8fb66b7ceeff7069ac4f5294 WatchSource:0}: Error finding container 7cd9458024453bf05d47669233ce1e6553d28d7c8fb66b7ceeff7069ac4f5294: Status 404 returned error can't find the container with id 7cd9458024453bf05d47669233ce1e6553d28d7c8fb66b7ceeff7069ac4f5294
Apr 23 01:15:35.087921 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:35.087903 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:15:35.872762 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:35.872720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5" event={"ID":"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2","Type":"ContainerStarted","Data":"7cd9458024453bf05d47669233ce1e6553d28d7c8fb66b7ceeff7069ac4f5294"}
Apr 23 01:15:37.879682 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:37.879589 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5" event={"ID":"45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2","Type":"ContainerStarted","Data":"682119d6c171473c78903bfdf5511c550410ba1ac1cf1dab5f27ea2869fec4da"}
Apr 23 01:15:37.893225 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:37.893070 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7mtq5" podStartSLOduration=1.413594139 podStartE2EDuration="3.89305703s" podCreationTimestamp="2026-04-23 01:15:34 +0000 UTC" firstStartedPulling="2026-04-23 01:15:35.088038096 +0000 UTC m=+342.873712431" lastFinishedPulling="2026-04-23 01:15:37.567500983 +0000 UTC m=+345.353175322" observedRunningTime="2026-04-23 01:15:37.892920787 +0000 UTC m=+345.678595147" watchObservedRunningTime="2026-04-23 01:15:37.89305703 +0000 UTC m=+345.678731387"
Apr 23 01:15:53.694443 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.694360 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf"]
Apr 23 01:15:53.697980 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.697958 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf"
Apr 23 01:15:53.700574 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.700545 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 23 01:15:53.700717 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.700633 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 23 01:15:53.700973 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.700950 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 23 01:15:53.701116 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.701062 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 23 01:15:53.701208 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.701190 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-hmfjw\""
Apr 23 01:15:53.715138 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.715111 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf"]
Apr 23 01:15:53.818844 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.818803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkwcm\" (UniqueName: \"kubernetes.io/projected/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-kube-api-access-pkwcm\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf"
Apr 23 01:15:53.819028 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.818905 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf"
Apr 23 01:15:53.819028 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.818940 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf"
Apr 23 01:15:53.920005 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.919967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName:
\"kubernetes.io/secret/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:53.920151 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.920015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:53.920151 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.920069 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkwcm\" (UniqueName: \"kubernetes.io/projected/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-kube-api-access-pkwcm\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:53.922376 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.922344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-webhook-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:53.922498 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.922392 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:53.929647 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:53.929604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkwcm\" (UniqueName: \"kubernetes.io/projected/45c1a1fa-47fe-4d79-813d-abe2e5c22d31-kube-api-access-pkwcm\") pod \"opendatahub-operator-controller-manager-5fb5768b86-hvntf\" (UID: \"45c1a1fa-47fe-4d79-813d-abe2e5c22d31\") " pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:54.009264 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:54.009194 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:54.129502 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:54.129470 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf"] Apr 23 01:15:54.133682 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:15:54.133651 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c1a1fa_47fe_4d79_813d_abe2e5c22d31.slice/crio-293fe416a62c428b16165a20604809d0a350b700172f58f3afbe0022b909c597 WatchSource:0}: Error finding container 293fe416a62c428b16165a20604809d0a350b700172f58f3afbe0022b909c597: Status 404 returned error can't find the container with id 293fe416a62c428b16165a20604809d0a350b700172f58f3afbe0022b909c597 Apr 23 01:15:54.930095 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:54.930043 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" event={"ID":"45c1a1fa-47fe-4d79-813d-abe2e5c22d31","Type":"ContainerStarted","Data":"293fe416a62c428b16165a20604809d0a350b700172f58f3afbe0022b909c597"} Apr 23 01:15:56.938706 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:56.938593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" event={"ID":"45c1a1fa-47fe-4d79-813d-abe2e5c22d31","Type":"ContainerStarted","Data":"de311de07f5ace5eafe764bf517d1a75e71c577535776e1b087bcb30916dd42b"} Apr 23 01:15:56.939040 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:56.938762 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:15:56.971445 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:15:56.971382 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" podStartSLOduration=1.4633743510000001 podStartE2EDuration="3.97136464s" podCreationTimestamp="2026-04-23 01:15:53 +0000 UTC" firstStartedPulling="2026-04-23 01:15:54.136187469 +0000 UTC m=+361.921861804" lastFinishedPulling="2026-04-23 01:15:56.644177758 +0000 UTC m=+364.429852093" observedRunningTime="2026-04-23 01:15:56.96955344 +0000 UTC m=+364.755227832" watchObservedRunningTime="2026-04-23 01:15:56.97136464 +0000 UTC m=+364.757038997" Apr 23 01:16:07.943536 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:07.943505 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5fb5768b86-hvntf" Apr 23 01:16:13.221464 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.221430 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7"] Apr 23 01:16:13.224565 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.224548 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.226894 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.226871 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 23 01:16:13.228032 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.228011 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-slpqr\"" Apr 23 01:16:13.228032 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.228024 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 23 01:16:13.228148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.228093 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 23 01:16:13.232897 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.232872 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7"] Apr 23 01:16:13.281413 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.281382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-manager-config\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.281584 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.281429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-cert\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.281584 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.281452 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7pvl\" (UniqueName: \"kubernetes.io/projected/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-kube-api-access-n7pvl\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.281584 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.281544 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-metrics-cert\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.382258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.382222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-cert\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.382258 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.382263 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n7pvl\" (UniqueName: \"kubernetes.io/projected/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-kube-api-access-n7pvl\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.382504 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.382285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-metrics-cert\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.382504 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.382362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-manager-config\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.383015 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.382991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-manager-config\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.384833 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.384811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-cert\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.384962 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.384942 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-metrics-cert\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.389691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.389672 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7pvl\" (UniqueName: \"kubernetes.io/projected/998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad-kube-api-access-n7pvl\") pod \"lws-controller-manager-6b799cbd77-t4mj7\" (UID: \"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad\") " pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.534755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.534668 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:13.663099 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.663072 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7"] Apr 23 01:16:13.665231 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:16:13.665197 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod998ca98b_02e0_4f9b_8e9e_6c4fb7b647ad.slice/crio-16603550b5acf5c45093d9b356b1beab9ce540acc1ef7c9e9a10357844032ff6 WatchSource:0}: Error finding container 16603550b5acf5c45093d9b356b1beab9ce540acc1ef7c9e9a10357844032ff6: Status 404 returned error can't find the container with id 16603550b5acf5c45093d9b356b1beab9ce540acc1ef7c9e9a10357844032ff6 Apr 23 01:16:13.992024 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:13.991994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" event={"ID":"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad","Type":"ContainerStarted","Data":"16603550b5acf5c45093d9b356b1beab9ce540acc1ef7c9e9a10357844032ff6"} Apr 23 01:16:16.000474 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:16.000388 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" event={"ID":"998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad","Type":"ContainerStarted","Data":"eae41759d2c008a359abb01c6a93f53e7c68dd69703d47691763575608007fa0"} Apr 23 01:16:16.000865 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:16.000502 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:16.019297 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:16.019256 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" podStartSLOduration=0.947494915 podStartE2EDuration="3.019241474s" podCreationTimestamp="2026-04-23 01:16:13 +0000 UTC" firstStartedPulling="2026-04-23 01:16:13.667467204 +0000 UTC m=+381.453141538" lastFinishedPulling="2026-04-23 01:16:15.739213762 +0000 UTC m=+383.524888097" observedRunningTime="2026-04-23 01:16:16.017271661 +0000 UTC m=+383.802946029" watchObservedRunningTime="2026-04-23 01:16:16.019241474 +0000 UTC m=+383.804915832" Apr 23 01:16:27.007509 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:27.007477 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6b799cbd77-t4mj7" Apr 23 01:16:41.590002 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.589943 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln"] Apr 23 01:16:41.593423 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.593398 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.595788 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.595748 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 01:16:41.595946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.595846 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 01:16:41.595946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.595857 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 23 01:16:41.595946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.595898 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-5rxc6\"" Apr 23 01:16:41.605492 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.605468 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln"] Apr 23 01:16:41.730769 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.730737 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.730769 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.730771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/814b8694-9550-460f-a692-dce74660f64d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.730988 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.730794 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.730988 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.730846 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.730988 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.730864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-credential-socket\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.730988 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.730883 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.730988 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.730954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/814b8694-9550-460f-a692-dce74660f64d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.731200 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.731005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/814b8694-9550-460f-a692-dce74660f64d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.731200 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.731094 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97xtd\" (UniqueName: \"kubernetes.io/projected/814b8694-9550-460f-a692-dce74660f64d-kube-api-access-97xtd\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832187 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832151 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97xtd\" (UniqueName: \"kubernetes.io/projected/814b8694-9550-460f-a692-dce74660f64d-kube-api-access-97xtd\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/814b8694-9550-460f-a692-dce74660f64d-istiod-ca-cert\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832595 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832384 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832595 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832595 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/814b8694-9550-460f-a692-dce74660f64d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832595 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832492 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/814b8694-9550-460f-a692-dce74660f64d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832834 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832807 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.832906 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.832859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.833061 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.833034 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.833184 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.833099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.833184 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.833155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/814b8694-9550-460f-a692-dce74660f64d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.834716 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.834688 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/814b8694-9550-460f-a692-dce74660f64d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.835027 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.835010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/814b8694-9550-460f-a692-dce74660f64d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.852023 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.851939 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/814b8694-9550-460f-a692-dce74660f64d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.852149 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.852130 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-97xtd\" (UniqueName: \"kubernetes.io/projected/814b8694-9550-460f-a692-dce74660f64d-kube-api-access-97xtd\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4qzln\" (UID: \"814b8694-9550-460f-a692-dce74660f64d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:41.905111 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:41.905065 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:42.080515 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:42.080479 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln"] Apr 23 01:16:42.084062 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:16:42.084034 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814b8694_9550_460f_a692_dce74660f64d.slice/crio-75e6e8a95c6f32359a5c0774500d637c8bcd3ceb7167d5bdcb6b6a426179d333 WatchSource:0}: Error finding container 75e6e8a95c6f32359a5c0774500d637c8bcd3ceb7167d5bdcb6b6a426179d333: Status 404 returned error can't find the container with id 75e6e8a95c6f32359a5c0774500d637c8bcd3ceb7167d5bdcb6b6a426179d333 Apr 23 01:16:43.090271 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:43.090226 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" event={"ID":"814b8694-9550-460f-a692-dce74660f64d","Type":"ContainerStarted","Data":"75e6e8a95c6f32359a5c0774500d637c8bcd3ceb7167d5bdcb6b6a426179d333"} Apr 23 01:16:44.618816 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:44.618777 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 01:16:44.619064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:44.618854 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 01:16:44.619064 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:44.618905 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 01:16:45.100247 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:45.100211 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" event={"ID":"814b8694-9550-460f-a692-dce74660f64d","Type":"ContainerStarted","Data":"27baa2883892e30bc9ca410cc19ee9b7f75575226f77e2b4175acaf74eda1333"} Apr 23 01:16:45.119030 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:45.118977 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" podStartSLOduration=1.586465257 podStartE2EDuration="4.11896177s" podCreationTimestamp="2026-04-23 01:16:41 +0000 UTC" firstStartedPulling="2026-04-23 01:16:42.085984284 +0000 UTC m=+409.871658619" lastFinishedPulling="2026-04-23 01:16:44.618480788 +0000 UTC m=+412.404155132" observedRunningTime="2026-04-23 01:16:45.118122836 +0000 UTC m=+412.903797193" 
watchObservedRunningTime="2026-04-23 01:16:45.11896177 +0000 UTC m=+412.904636196" Apr 23 01:16:45.906124 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:45.906083 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:45.910807 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:45.910784 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:46.103374 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:46.103337 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:16:46.104266 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:16:46.104247 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4qzln" Apr 23 01:17:11.744020 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:11.743926 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wgqcm"] Apr 23 01:17:11.749396 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:11.749379 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" Apr 23 01:17:11.751746 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:11.751705 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 01:17:11.751883 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:11.751705 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-2scdn\"" Apr 23 01:17:11.751971 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:11.751956 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 01:17:11.757432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:11.757409 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wgqcm"] Apr 23 01:17:11.950183 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:11.950146 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjhq\" (UniqueName: \"kubernetes.io/projected/f70168e1-7dd1-4b0c-bd66-a8b5a6e79419-kube-api-access-6rjhq\") pod \"kuadrant-operator-catalog-wgqcm\" (UID: \"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419\") " pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" Apr 23 01:17:12.051550 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.051454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rjhq\" (UniqueName: \"kubernetes.io/projected/f70168e1-7dd1-4b0c-bd66-a8b5a6e79419-kube-api-access-6rjhq\") pod \"kuadrant-operator-catalog-wgqcm\" (UID: \"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419\") " pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" Apr 23 01:17:12.058774 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.058747 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rjhq\" (UniqueName: \"kubernetes.io/projected/f70168e1-7dd1-4b0c-bd66-a8b5a6e79419-kube-api-access-6rjhq\") pod \"kuadrant-operator-catalog-wgqcm\" (UID: 
\"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419\") " pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" Apr 23 01:17:12.058992 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.058864 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" Apr 23 01:17:12.120480 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.119447 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wgqcm"] Apr 23 01:17:12.187677 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.187643 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wgqcm"] Apr 23 01:17:12.191328 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:17:12.191301 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf70168e1_7dd1_4b0c_bd66_a8b5a6e79419.slice/crio-99e5070a13ac88cbf97e67f1c03fe8a784281e639655444b63a75048b8b54539 WatchSource:0}: Error finding container 99e5070a13ac88cbf97e67f1c03fe8a784281e639655444b63a75048b8b54539: Status 404 returned error can't find the container with id 99e5070a13ac88cbf97e67f1c03fe8a784281e639655444b63a75048b8b54539 Apr 23 01:17:12.324302 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.324234 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5qqtk"] Apr 23 01:17:12.327311 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.327296 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" Apr 23 01:17:12.333928 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.333889 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5qqtk"] Apr 23 01:17:12.353120 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.353096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfmw\" (UniqueName: \"kubernetes.io/projected/ca2f3745-de45-4910-b99a-354a7ed67843-kube-api-access-tsfmw\") pod \"kuadrant-operator-catalog-5qqtk\" (UID: \"ca2f3745-de45-4910-b99a-354a7ed67843\") " pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" Apr 23 01:17:12.453823 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.453790 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfmw\" (UniqueName: \"kubernetes.io/projected/ca2f3745-de45-4910-b99a-354a7ed67843-kube-api-access-tsfmw\") pod \"kuadrant-operator-catalog-5qqtk\" (UID: \"ca2f3745-de45-4910-b99a-354a7ed67843\") " pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" Apr 23 01:17:12.461094 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.461069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfmw\" (UniqueName: \"kubernetes.io/projected/ca2f3745-de45-4910-b99a-354a7ed67843-kube-api-access-tsfmw\") pod \"kuadrant-operator-catalog-5qqtk\" (UID: \"ca2f3745-de45-4910-b99a-354a7ed67843\") " pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" Apr 23 01:17:12.637865 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.637824 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" Apr 23 01:17:12.767288 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:12.767264 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5qqtk"] Apr 23 01:17:12.767770 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:17:12.767747 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca2f3745_de45_4910_b99a_354a7ed67843.slice/crio-ce576941f8023972b0d7ba61a8300dc2e81f0dd454e0c8fc48f808eb14bbd97c WatchSource:0}: Error finding container ce576941f8023972b0d7ba61a8300dc2e81f0dd454e0c8fc48f808eb14bbd97c: Status 404 returned error can't find the container with id ce576941f8023972b0d7ba61a8300dc2e81f0dd454e0c8fc48f808eb14bbd97c Apr 23 01:17:13.194906 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:13.194862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" event={"ID":"ca2f3745-de45-4910-b99a-354a7ed67843","Type":"ContainerStarted","Data":"ce576941f8023972b0d7ba61a8300dc2e81f0dd454e0c8fc48f808eb14bbd97c"} Apr 23 01:17:13.196453 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:13.196421 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" event={"ID":"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419","Type":"ContainerStarted","Data":"99e5070a13ac88cbf97e67f1c03fe8a784281e639655444b63a75048b8b54539"} Apr 23 01:17:15.204097 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.204058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" event={"ID":"ca2f3745-de45-4910-b99a-354a7ed67843","Type":"ContainerStarted","Data":"bf30281bb8bf52ca4bebf6f3bfdce606d35a70fed84918141f0aa45fc096a711"} Apr 23 01:17:15.205416 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.205391 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" event={"ID":"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419","Type":"ContainerStarted","Data":"6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2"} Apr 23 01:17:15.205540 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.205453 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" podUID="f70168e1-7dd1-4b0c-bd66-a8b5a6e79419" containerName="registry-server" containerID="cri-o://6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2" gracePeriod=2 Apr 23 01:17:15.217655 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.217574 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk" podStartSLOduration=1.763325487 podStartE2EDuration="3.217545561s" podCreationTimestamp="2026-04-23 01:17:12 +0000 UTC" firstStartedPulling="2026-04-23 01:17:12.769297309 +0000 UTC m=+440.554971645" lastFinishedPulling="2026-04-23 01:17:14.223517382 +0000 UTC m=+442.009191719" observedRunningTime="2026-04-23 01:17:15.217492892 +0000 UTC m=+443.003167250" watchObservedRunningTime="2026-04-23 01:17:15.217545561 +0000 UTC m=+443.003219920" Apr 23 01:17:15.233384 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.233341 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" podStartSLOduration=2.203218468 podStartE2EDuration="4.233328139s" podCreationTimestamp="2026-04-23 
01:17:11 +0000 UTC" firstStartedPulling="2026-04-23 01:17:12.192922825 +0000 UTC m=+439.978597161" lastFinishedPulling="2026-04-23 01:17:14.223032493 +0000 UTC m=+442.008706832" observedRunningTime="2026-04-23 01:17:15.231035707 +0000 UTC m=+443.016710064" watchObservedRunningTime="2026-04-23 01:17:15.233328139 +0000 UTC m=+443.019002495" Apr 23 01:17:15.451197 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.451173 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" Apr 23 01:17:15.479958 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.479890 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rjhq\" (UniqueName: \"kubernetes.io/projected/f70168e1-7dd1-4b0c-bd66-a8b5a6e79419-kube-api-access-6rjhq\") pod \"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419\" (UID: \"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419\") " Apr 23 01:17:15.482090 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.482069 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70168e1-7dd1-4b0c-bd66-a8b5a6e79419-kube-api-access-6rjhq" (OuterVolumeSpecName: "kube-api-access-6rjhq") pod "f70168e1-7dd1-4b0c-bd66-a8b5a6e79419" (UID: "f70168e1-7dd1-4b0c-bd66-a8b5a6e79419"). InnerVolumeSpecName "kube-api-access-6rjhq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:17:15.580942 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:15.580897 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rjhq\" (UniqueName: \"kubernetes.io/projected/f70168e1-7dd1-4b0c-bd66-a8b5a6e79419-kube-api-access-6rjhq\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:17:16.209897 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.209860 2569 generic.go:358] "Generic (PLEG): container finished" podID="f70168e1-7dd1-4b0c-bd66-a8b5a6e79419" containerID="6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2" exitCode=0 Apr 23 01:17:16.210335 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.209922 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm"
Apr 23 01:17:16.210335 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.209950 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" event={"ID":"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419","Type":"ContainerDied","Data":"6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2"}
Apr 23 01:17:16.210335 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.209991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wgqcm" event={"ID":"f70168e1-7dd1-4b0c-bd66-a8b5a6e79419","Type":"ContainerDied","Data":"99e5070a13ac88cbf97e67f1c03fe8a784281e639655444b63a75048b8b54539"}
Apr 23 01:17:16.210335 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.210012 2569 scope.go:117] "RemoveContainer" containerID="6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2"
Apr 23 01:17:16.218748 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.218729 2569 scope.go:117] "RemoveContainer" containerID="6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2"
Apr 23 01:17:16.219028 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:17:16.219009 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2\": container with ID starting with 6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2 not found: ID does not exist" containerID="6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2"
Apr 23 01:17:16.219083 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.219037 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2"} err="failed to get container status \"6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2\": rpc error: code = NotFound desc = could not find container \"6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2\": container with ID starting with 6e9d28370b370eedff842c538e8748d60ddac18ade78ad161c5cb33adab022b2 not found: ID does not exist"
Apr 23 01:17:16.228831 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.228804 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wgqcm"]
Apr 23 01:17:16.231038 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.231015 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wgqcm"]
Apr 23 01:17:16.766734 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:16.766703 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f70168e1-7dd1-4b0c-bd66-a8b5a6e79419" path="/var/lib/kubelet/pods/f70168e1-7dd1-4b0c-bd66-a8b5a6e79419/volumes"
Apr 23 01:17:22.638468 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:22.638428 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk"
Apr 23 01:17:22.638468 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:22.638471 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk"
Apr 23 01:17:22.659947 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:22.659920 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk"
Apr 23 01:17:23.254975 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:23.254942 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-5qqtk"
Apr 23 01:17:43.693135 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.693096 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hktjf"]
Apr 23 01:17:43.693622 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.693437 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f70168e1-7dd1-4b0c-bd66-a8b5a6e79419" containerName="registry-server"
Apr 23 01:17:43.693622 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.693448 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70168e1-7dd1-4b0c-bd66-a8b5a6e79419" containerName="registry-server"
Apr 23 01:17:43.693622 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.693507 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f70168e1-7dd1-4b0c-bd66-a8b5a6e79419" containerName="registry-server"
Apr 23 01:17:43.695304 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.695288 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-hktjf"
Apr 23 01:17:43.701090 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.701064 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-rx2mb\""
Apr 23 01:17:43.706079 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.706051 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hktjf"]
Apr 23 01:17:43.821785 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.821715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtzpg\" (UniqueName: \"kubernetes.io/projected/6aa9b334-0986-4219-8395-33b3949fd6c2-kube-api-access-wtzpg\") pod \"authorino-operator-657f44b778-hktjf\" (UID: \"6aa9b334-0986-4219-8395-33b3949fd6c2\") " pod="kuadrant-system/authorino-operator-657f44b778-hktjf"
Apr 23 01:17:43.922582 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.922551 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtzpg\" (UniqueName: \"kubernetes.io/projected/6aa9b334-0986-4219-8395-33b3949fd6c2-kube-api-access-wtzpg\") pod \"authorino-operator-657f44b778-hktjf\" (UID: \"6aa9b334-0986-4219-8395-33b3949fd6c2\") " pod="kuadrant-system/authorino-operator-657f44b778-hktjf"
Apr 23 01:17:43.932645 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:43.932602 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtzpg\" (UniqueName: \"kubernetes.io/projected/6aa9b334-0986-4219-8395-33b3949fd6c2-kube-api-access-wtzpg\") pod \"authorino-operator-657f44b778-hktjf\" (UID: \"6aa9b334-0986-4219-8395-33b3949fd6c2\") " pod="kuadrant-system/authorino-operator-657f44b778-hktjf"
Apr 23 01:17:44.010585 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.010499 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-hktjf"
Apr 23 01:17:44.142506 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.142480 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hktjf"]
Apr 23 01:17:44.145169 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:17:44.145134 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa9b334_0986_4219_8395_33b3949fd6c2.slice/crio-682f089463249066c066b8899bc75fa68e226d466873669ee6eecfef2fe73dfc WatchSource:0}: Error finding container 682f089463249066c066b8899bc75fa68e226d466873669ee6eecfef2fe73dfc: Status 404 returned error can't find the container with id 682f089463249066c066b8899bc75fa68e226d466873669ee6eecfef2fe73dfc
Apr 23 01:17:44.309000 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.308913 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-hktjf" event={"ID":"6aa9b334-0986-4219-8395-33b3949fd6c2","Type":"ContainerStarted","Data":"682f089463249066c066b8899bc75fa68e226d466873669ee6eecfef2fe73dfc"}
Apr 23 01:17:44.802471 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.802435 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f59b5659c-7w4ng"]
Apr 23 01:17:44.804983 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.804956 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:44.810069 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.809817 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 01:17:44.810069 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.809849 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-cm99s\""
Apr 23 01:17:44.810069 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.809898 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 01:17:44.810069 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.810029 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 01:17:44.811207 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.811183 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 01:17:44.811644 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.811466 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 01:17:44.818470 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.818452 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 01:17:44.825367 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.825335 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f59b5659c-7w4ng"]
Apr 23 01:17:44.932144 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.932099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrc2q\" (UniqueName: \"kubernetes.io/projected/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-kube-api-access-hrc2q\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:44.932333 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.932175 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-oauth-serving-cert\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:44.932333 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.932237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-service-ca\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:44.932333 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.932270 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-config\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:44.932333 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.932303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-oauth-config\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:44.932496 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.932356 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-serving-cert\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:44.932496 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:44.932382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-trusted-ca-bundle\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.033725 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.033674 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-oauth-serving-cert\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.033926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.033754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-service-ca\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.033926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.033791 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-config\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.033926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.033827 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-oauth-config\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.033926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.033869 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-serving-cert\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.033926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.033895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-trusted-ca-bundle\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.034292 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.033943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrc2q\" (UniqueName: \"kubernetes.io/projected/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-kube-api-access-hrc2q\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.034772 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.034529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-oauth-serving-cert\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.035300 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.035276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-config\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.035415 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.035354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-trusted-ca-bundle\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.035670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.035646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-service-ca\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.038204 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.038156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-serving-cert\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.039066 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.039025 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-console-oauth-config\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.049276 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.049231 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrc2q\" (UniqueName: \"kubernetes.io/projected/9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c-kube-api-access-hrc2q\") pod \"console-5f59b5659c-7w4ng\" (UID: \"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c\") " pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.118523 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.118439 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:45.268893 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.268813 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f59b5659c-7w4ng"]
Apr 23 01:17:45.271464 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:17:45.271432 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fa6f04f_f8e0_4cbc_94b9_9114ea798d6c.slice/crio-4548ce281d0d9cf4b93877215018e3d60df8e5a0c4e0a82662cfe9dc05aac524 WatchSource:0}: Error finding container 4548ce281d0d9cf4b93877215018e3d60df8e5a0c4e0a82662cfe9dc05aac524: Status 404 returned error can't find the container with id 4548ce281d0d9cf4b93877215018e3d60df8e5a0c4e0a82662cfe9dc05aac524
Apr 23 01:17:45.313828 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:45.313791 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f59b5659c-7w4ng" event={"ID":"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c","Type":"ContainerStarted","Data":"4548ce281d0d9cf4b93877215018e3d60df8e5a0c4e0a82662cfe9dc05aac524"}
Apr 23 01:17:46.321481 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:46.321445 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f59b5659c-7w4ng" event={"ID":"9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c","Type":"ContainerStarted","Data":"1861d820c5f7c70cdbede21fe93396d67b3a9c8bb95fb777134964876e6a67e9"}
Apr 23 01:17:46.338468 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:46.338423 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f59b5659c-7w4ng" podStartSLOduration=2.338407909 podStartE2EDuration="2.338407909s" podCreationTimestamp="2026-04-23 01:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:17:46.336357498 +0000 UTC m=+474.122031857" watchObservedRunningTime="2026-04-23 01:17:46.338407909 +0000 UTC m=+474.124082265"
Apr 23 01:17:47.326242 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:47.326209 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-hktjf" event={"ID":"6aa9b334-0986-4219-8395-33b3949fd6c2","Type":"ContainerStarted","Data":"c68af290bb4c3ec7ebf2c7cee46114a022ee4557cc5ea807906323be409afdde"}
Apr 23 01:17:47.326713 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:47.326474 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-hktjf"
Apr 23 01:17:47.344171 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:47.344122 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-hktjf" podStartSLOduration=2.140169871 podStartE2EDuration="4.344106351s" podCreationTimestamp="2026-04-23 01:17:43 +0000 UTC" firstStartedPulling="2026-04-23 01:17:44.147105592 +0000 UTC m=+471.932779928" lastFinishedPulling="2026-04-23 01:17:46.351042062 +0000 UTC m=+474.136716408" observedRunningTime="2026-04-23 01:17:47.342775123 +0000 UTC m=+475.128449479" watchObservedRunningTime="2026-04-23 01:17:47.344106351 +0000 UTC m=+475.129780708"
Apr 23 01:17:50.301367 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.301332 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"]
Apr 23 01:17:50.305401 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.305380 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"
Apr 23 01:17:50.307738 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.307718 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-n75n9\""
Apr 23 01:17:50.308062 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.308034 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 23 01:17:50.316312 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.316287 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"]
Apr 23 01:17:50.381867 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.381833 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbbt\" (UniqueName: \"kubernetes.io/projected/b743d95c-4e3b-49d3-867e-0354895fee99-kube-api-access-wpbbt\") pod \"dns-operator-controller-manager-648d5c98bc-lktzw\" (UID: \"b743d95c-4e3b-49d3-867e-0354895fee99\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"
Apr 23 01:17:50.482824 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.482788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbbt\" (UniqueName: \"kubernetes.io/projected/b743d95c-4e3b-49d3-867e-0354895fee99-kube-api-access-wpbbt\") pod \"dns-operator-controller-manager-648d5c98bc-lktzw\" (UID: \"b743d95c-4e3b-49d3-867e-0354895fee99\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"
Apr 23 01:17:50.490805 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.490774 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbbt\" (UniqueName: \"kubernetes.io/projected/b743d95c-4e3b-49d3-867e-0354895fee99-kube-api-access-wpbbt\") pod \"dns-operator-controller-manager-648d5c98bc-lktzw\" (UID: \"b743d95c-4e3b-49d3-867e-0354895fee99\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"
Apr 23 01:17:50.616177 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.616101 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"
Apr 23 01:17:50.743270 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:50.743249 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"]
Apr 23 01:17:50.745835 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:17:50.745810 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb743d95c_4e3b_49d3_867e_0354895fee99.slice/crio-bfcd8d9ce32c5b240d5fdd09e79397e4280a233f1b750a177a4f3001fe904dad WatchSource:0}: Error finding container bfcd8d9ce32c5b240d5fdd09e79397e4280a233f1b750a177a4f3001fe904dad: Status 404 returned error can't find the container with id bfcd8d9ce32c5b240d5fdd09e79397e4280a233f1b750a177a4f3001fe904dad
Apr 23 01:17:51.341868 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:51.341765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw" event={"ID":"b743d95c-4e3b-49d3-867e-0354895fee99","Type":"ContainerStarted","Data":"bfcd8d9ce32c5b240d5fdd09e79397e4280a233f1b750a177a4f3001fe904dad"}
Apr 23 01:17:53.301541 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:53.301516 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 23 01:17:54.353180 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:54.353141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw" event={"ID":"b743d95c-4e3b-49d3-867e-0354895fee99","Type":"ContainerStarted","Data":"3c032925ac8b48e7e7e68baed5e0d81edc16f72afbe4dc7d19076c72c86267a6"}
Apr 23 01:17:54.353677 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:54.353256 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"
Apr 23 01:17:54.370546 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:54.370496 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw" podStartSLOduration=1.8196913129999999 podStartE2EDuration="4.370482365s" podCreationTimestamp="2026-04-23 01:17:50 +0000 UTC" firstStartedPulling="2026-04-23 01:17:50.748253934 +0000 UTC m=+478.533928272" lastFinishedPulling="2026-04-23 01:17:53.299044989 +0000 UTC m=+481.084719324" observedRunningTime="2026-04-23 01:17:54.36821883 +0000 UTC m=+482.153893187" watchObservedRunningTime="2026-04-23 01:17:54.370482365 +0000 UTC m=+482.156156722"
Apr 23 01:17:55.119276 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.119240 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:55.119450 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.119323 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:55.124303 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.124279 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:55.154945 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.154911 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"]
Apr 23 01:17:55.157188 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.157172 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.159532 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.159511 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-94w6b\""
Apr 23 01:17:55.167853 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.167829 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"]
Apr 23 01:17:55.328749 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.328712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cg89\" (UniqueName: \"kubernetes.io/projected/b45a2e15-ba50-4ac6-b55d-79480e67a497-kube-api-access-2cg89\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.328919 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.328762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b45a2e15-ba50-4ac6-b55d-79480e67a497-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.360734 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.360707 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f59b5659c-7w4ng"
Apr 23 01:17:55.429500 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.429464 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b45a2e15-ba50-4ac6-b55d-79480e67a497-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.429701 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.429593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cg89\" (UniqueName: \"kubernetes.io/projected/b45a2e15-ba50-4ac6-b55d-79480e67a497-kube-api-access-2cg89\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.429936 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.429914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b45a2e15-ba50-4ac6-b55d-79480e67a497-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.437371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.437345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cg89\" (UniqueName: \"kubernetes.io/projected/b45a2e15-ba50-4ac6-b55d-79480e67a497-kube-api-access-2cg89\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.467746 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.467713 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:17:55.594215 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:55.594183 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"]
Apr 23 01:17:55.597454 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:17:55.597423 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45a2e15_ba50_4ac6_b55d_79480e67a497.slice/crio-8ad9fa3e01bba0dd9d7af5b03a5cd03e2fb48cc5793a6e282d86d4b65910b106 WatchSource:0}: Error finding container 8ad9fa3e01bba0dd9d7af5b03a5cd03e2fb48cc5793a6e282d86d4b65910b106: Status 404 returned error can't find the container with id 8ad9fa3e01bba0dd9d7af5b03a5cd03e2fb48cc5793a6e282d86d4b65910b106
Apr 23 01:17:56.361305 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:56.361255 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" event={"ID":"b45a2e15-ba50-4ac6-b55d-79480e67a497","Type":"ContainerStarted","Data":"8ad9fa3e01bba0dd9d7af5b03a5cd03e2fb48cc5793a6e282d86d4b65910b106"}
Apr 23 01:17:58.332159 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:17:58.332123 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-hktjf"
Apr 23 01:18:01.380684 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:01.380650 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" event={"ID":"b45a2e15-ba50-4ac6-b55d-79480e67a497","Type":"ContainerStarted","Data":"2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d"}
Apr 23 01:18:01.381066 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:01.380881 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:18:01.403182 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:01.403130 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" podStartSLOduration=1.206477644 podStartE2EDuration="6.403117892s" podCreationTimestamp="2026-04-23 01:17:55 +0000 UTC" firstStartedPulling="2026-04-23 01:17:55.600067661 +0000 UTC m=+483.385741997" lastFinishedPulling="2026-04-23 01:18:00.79670791 +0000 UTC m=+488.582382245" observedRunningTime="2026-04-23 01:18:01.401292581 +0000 UTC m=+489.186966939" watchObservedRunningTime="2026-04-23 01:18:01.403117892 +0000 UTC m=+489.188792248"
Apr 23 01:18:05.359904 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:05.359869 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lktzw"
Apr 23 01:18:12.385991 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:12.385963 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:18:13.305509 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.305476 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"]
Apr 23 01:18:13.307784 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.307768 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.319150 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.319127 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"]
Apr 23 01:18:13.376624 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.376583 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f67ae350-e7a9-47f3-98bd-f42432d98be2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.376766 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.376636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr5sn\" (UniqueName: \"kubernetes.io/projected/f67ae350-e7a9-47f3-98bd-f42432d98be2-kube-api-access-cr5sn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.477816 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.477781 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f67ae350-e7a9-47f3-98bd-f42432d98be2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.477816 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.477818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr5sn\" (UniqueName: \"kubernetes.io/projected/f67ae350-e7a9-47f3-98bd-f42432d98be2-kube-api-access-cr5sn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.478201 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.478137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f67ae350-e7a9-47f3-98bd-f42432d98be2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.497387 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.497361 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr5sn\" (UniqueName: \"kubernetes.io/projected/f67ae350-e7a9-47f3-98bd-f42432d98be2-kube-api-access-cr5sn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.619542 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.619463 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:13.741908 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.741885 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"]
Apr 23 01:18:13.744408 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:18:13.744378 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf67ae350_e7a9_47f3_98bd_f42432d98be2.slice/crio-221e74d88d0514c5e517ff33f488f21fb1537684b515c1b1cf4a761e63944ab7 WatchSource:0}: Error finding container 221e74d88d0514c5e517ff33f488f21fb1537684b515c1b1cf4a761e63944ab7: Status 404 returned error can't find the container with id 221e74d88d0514c5e517ff33f488f21fb1537684b515c1b1cf4a761e63944ab7
Apr 23 01:18:13.955331 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.955294 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"]
Apr 23 01:18:13.955534 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.955484 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" containerName="manager" containerID="cri-o://2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d" gracePeriod=2
Apr 23 01:18:13.965066 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.964993 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"]
Apr 23 01:18:13.974759 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.974729 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"]
Apr 23 01:18:13.982457 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.982372 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"]
Apr 23 01:18:13.988915 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.988888 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"]
Apr 23 01:18:13.989245 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.989229 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" containerName="manager"
Apr 23 01:18:13.989245 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.989243 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" containerName="manager"
Apr 23 01:18:13.989392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.989269 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" containerName="manager"
Apr 23 01:18:13.989392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.989274 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" containerName="manager"
Apr 23 01:18:13.989392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.989323 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" containerName="manager"
Apr 23 01:18:13.989392 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.989332 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" containerName="manager"
Apr 23 01:18:13.991138 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:13.991118 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.010933 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.010905 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"]
Apr 23 01:18:14.013104 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.013079 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.023479 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.023372 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"]
Apr 23 01:18:14.028206 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.028170 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"]
Apr 23 01:18:14.033806 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.033774 2569 status_manager.go:895] "Failed to get status for pod" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.057842 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.057803 2569 status_manager.go:895] "Failed to get status for pod" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.083549 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.083521 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nst87\" (UniqueName: \"kubernetes.io/projected/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-kube-api-access-nst87\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w7999\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.083702 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.083664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjjn\" (UniqueName: \"kubernetes.io/projected/94998a26-8514-4fd9-8198-900153d547dd-kube-api-access-8cjjn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9l8hf\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.083756 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.083711 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/94998a26-8514-4fd9-8198-900153d547dd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9l8hf\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.083756 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.083745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w7999\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.176169 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.176146 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:18:14.178208 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.178183 2569 status_manager.go:895] "Failed to get status for pod" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.184581 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.184563 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cg89\" (UniqueName: \"kubernetes.io/projected/b45a2e15-ba50-4ac6-b55d-79480e67a497-kube-api-access-2cg89\") pod \"b45a2e15-ba50-4ac6-b55d-79480e67a497\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") "
Apr 23 01:18:14.184755 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.184743 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b45a2e15-ba50-4ac6-b55d-79480e67a497-extensions-socket-volume\") pod \"b45a2e15-ba50-4ac6-b55d-79480e67a497\" (UID: \"b45a2e15-ba50-4ac6-b55d-79480e67a497\") "
Apr 23 01:18:14.184859 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.184846 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjjn\" (UniqueName: \"kubernetes.io/projected/94998a26-8514-4fd9-8198-900153d547dd-kube-api-access-8cjjn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9l8hf\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.184915 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.184903 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/94998a26-8514-4fd9-8198-900153d547dd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9l8hf\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.184966 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.184928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w7999\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.185017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.184984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nst87\" (UniqueName: \"kubernetes.io/projected/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-kube-api-access-nst87\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w7999\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.185313 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.185285 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/94998a26-8514-4fd9-8198-900153d547dd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9l8hf\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.185400 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.185293 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b45a2e15-ba50-4ac6-b55d-79480e67a497-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "b45a2e15-ba50-4ac6-b55d-79480e67a497" (UID: "b45a2e15-ba50-4ac6-b55d-79480e67a497"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:18:14.185440 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.185426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w7999\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.186529 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.186509 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45a2e15-ba50-4ac6-b55d-79480e67a497-kube-api-access-2cg89" (OuterVolumeSpecName: "kube-api-access-2cg89") pod "b45a2e15-ba50-4ac6-b55d-79480e67a497" (UID: "b45a2e15-ba50-4ac6-b55d-79480e67a497"). InnerVolumeSpecName "kube-api-access-2cg89". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:18:14.194274 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.194248 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nst87\" (UniqueName: \"kubernetes.io/projected/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-kube-api-access-nst87\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-w7999\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.194518 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.194503 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjjn\" (UniqueName: \"kubernetes.io/projected/94998a26-8514-4fd9-8198-900153d547dd-kube-api-access-8cjjn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9l8hf\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.285974 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.285895 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b45a2e15-ba50-4ac6-b55d-79480e67a497-extensions-socket-volume\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\""
Apr 23 01:18:14.285974 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.285926 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cg89\" (UniqueName: \"kubernetes.io/projected/b45a2e15-ba50-4ac6-b55d-79480e67a497-kube-api-access-2cg89\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\""
Apr 23 01:18:14.347841 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.347808 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:14.355178 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.355152 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:14.430843 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.430807 2569 generic.go:358] "Generic (PLEG): container finished" podID="b45a2e15-ba50-4ac6-b55d-79480e67a497" containerID="2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d" exitCode=0
Apr 23 01:18:14.430987 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.430975 2569 scope.go:117] "RemoveContainer" containerID="2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d"
Apr 23 01:18:14.431130 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.431110 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx"
Apr 23 01:18:14.433095 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:18:14.433075 2569 kuberuntime_manager.go:623] "Missing actuated resource record" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq" container="manager"
Apr 23 01:18:14.435315 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.434986 2569 status_manager.go:895] "Failed to get status for pod" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.437201 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.437130 2569 status_manager.go:895] "Failed to get status for pod" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.439648 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.439111 2569 status_manager.go:895] "Failed to get status for pod" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.443876 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.443858 2569 scope.go:117] "RemoveContainer" containerID="2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d"
Apr 23 01:18:14.444165 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:18:14.444143 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d\": container with ID starting with 2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d not found: ID does not exist" containerID="2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d"
Apr 23 01:18:14.444248 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.444175 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d"} err="failed to get container status \"2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d\": rpc error: code = NotFound desc = could not find container \"2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d\": container with ID starting with 2b35b488ea8872b3c1c3ef2333e15fa6c434c579aba92f0433cf628ede80e12d not found: ID does not exist"
Apr 23 01:18:14.445741 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.445716 2569 status_manager.go:895] "Failed to get status for pod" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.447649 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.447594 2569 status_manager.go:895] "Failed to get status for pod" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9xhmx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-9xhmx\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:14.482858 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.482767 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"]
Apr 23 01:18:14.485735 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:18:14.485702 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94998a26_8514_4fd9_8198_900153d547dd.slice/crio-1d5feacc2ef459917f86273fa794ee2fee32c04f1ed57ee44da9ab20a6401671 WatchSource:0}: Error finding container 1d5feacc2ef459917f86273fa794ee2fee32c04f1ed57ee44da9ab20a6401671: Status 404 returned error can't find the container with id 1d5feacc2ef459917f86273fa794ee2fee32c04f1ed57ee44da9ab20a6401671
Apr 23 01:18:14.505752 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.505712 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"]
Apr 23 01:18:14.507947 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:18:14.507913 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fdaa77e_9a45_4627_aa2b_f8412f42ba7e.slice/crio-75761be218c3c5441f61d5255b740a06a2a8deecd4654a79367c3e74c57ef4c6 WatchSource:0}: Error finding container 75761be218c3c5441f61d5255b740a06a2a8deecd4654a79367c3e74c57ef4c6: Status 404 returned error can't find the container with id 75761be218c3c5441f61d5255b740a06a2a8deecd4654a79367c3e74c57ef4c6
Apr 23 01:18:14.768015 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:14.767984 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45a2e15-ba50-4ac6-b55d-79480e67a497" path="/var/lib/kubelet/pods/b45a2e15-ba50-4ac6-b55d-79480e67a497/volumes"
Apr 23 01:18:15.437293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.437259 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" event={"ID":"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e","Type":"ContainerStarted","Data":"4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6"}
Apr 23 01:18:15.437293 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.437297 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" event={"ID":"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e","Type":"ContainerStarted","Data":"75761be218c3c5441f61d5255b740a06a2a8deecd4654a79367c3e74c57ef4c6"}
Apr 23 01:18:15.437535 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.437362 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:15.438796 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.438770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" event={"ID":"94998a26-8514-4fd9-8198-900153d547dd","Type":"ContainerStarted","Data":"1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65"}
Apr 23 01:18:15.438897 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.438803 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" event={"ID":"94998a26-8514-4fd9-8198-900153d547dd","Type":"ContainerStarted","Data":"1d5feacc2ef459917f86273fa794ee2fee32c04f1ed57ee44da9ab20a6401671"}
Apr 23 01:18:15.438897 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.438848 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:15.440144 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.440116 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" containerName="manager" containerID="cri-o://f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b" gracePeriod=2
Apr 23 01:18:15.462141 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.461335 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" podStartSLOduration=2.461310073 podStartE2EDuration="2.461310073s" podCreationTimestamp="2026-04-23 01:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:18:15.460102162 +0000 UTC m=+503.245776520" watchObservedRunningTime="2026-04-23 01:18:15.461310073 +0000 UTC m=+503.246984433"
Apr 23 01:18:15.481490 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.481448 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" podStartSLOduration=2.481435285 podStartE2EDuration="2.481435285s" podCreationTimestamp="2026-04-23 01:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:18:15.480432316 +0000 UTC m=+503.266106674" watchObservedRunningTime="2026-04-23 01:18:15.481435285 +0000 UTC m=+503.267109699"
Apr 23 01:18:15.675172 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.675151 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:15.677114 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.677090 2569 status_manager.go:895] "Failed to get status for pod" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:15.701083 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.701034 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr5sn\" (UniqueName: \"kubernetes.io/projected/f67ae350-e7a9-47f3-98bd-f42432d98be2-kube-api-access-cr5sn\") pod \"f67ae350-e7a9-47f3-98bd-f42432d98be2\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") "
Apr 23 01:18:15.701083 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.701067 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f67ae350-e7a9-47f3-98bd-f42432d98be2-extensions-socket-volume\") pod \"f67ae350-e7a9-47f3-98bd-f42432d98be2\" (UID: \"f67ae350-e7a9-47f3-98bd-f42432d98be2\") "
Apr 23 01:18:15.701343 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.701323 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f67ae350-e7a9-47f3-98bd-f42432d98be2-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f67ae350-e7a9-47f3-98bd-f42432d98be2" (UID: "f67ae350-e7a9-47f3-98bd-f42432d98be2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:18:15.702977 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.702955 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67ae350-e7a9-47f3-98bd-f42432d98be2-kube-api-access-cr5sn" (OuterVolumeSpecName: "kube-api-access-cr5sn") pod "f67ae350-e7a9-47f3-98bd-f42432d98be2" (UID: "f67ae350-e7a9-47f3-98bd-f42432d98be2"). InnerVolumeSpecName "kube-api-access-cr5sn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:18:15.801864 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.801831 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cr5sn\" (UniqueName: \"kubernetes.io/projected/f67ae350-e7a9-47f3-98bd-f42432d98be2-kube-api-access-cr5sn\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\""
Apr 23 01:18:15.801864 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:15.801856 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f67ae350-e7a9-47f3-98bd-f42432d98be2-extensions-socket-volume\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\""
Apr 23 01:18:16.445185 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.445154 2569 generic.go:358] "Generic (PLEG): container finished" podID="f67ae350-e7a9-47f3-98bd-f42432d98be2" containerID="f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b" exitCode=2
Apr 23 01:18:16.445390 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.445203 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq"
Apr 23 01:18:16.445390 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.445260 2569 scope.go:117] "RemoveContainer" containerID="f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b"
Apr 23 01:18:16.447297 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.447267 2569 status_manager.go:895] "Failed to get status for pod" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:16.454074 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.454059 2569 scope.go:117] "RemoveContainer" containerID="f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b"
Apr 23 01:18:16.454322 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:18:16.454302 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b\": container with ID starting with f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b not found: ID does not exist" containerID="f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b"
Apr 23 01:18:16.454371 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.454330 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b"} err="failed to get container status \"f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b\": rpc error: code = NotFound desc = could not find container \"f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b\": container with ID starting with f8c44110888aa14441d961adb8c356729093b8e920958a84c14c308760680d2b not found: ID does not exist"
Apr 23 01:18:16.455927 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.455906 2569 status_manager.go:895] "Failed to get status for pod" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-mzjsq\" is forbidden: User \"system:node:ip-10-0-138-235.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-235.ec2.internal' and this object"
Apr 23 01:18:16.767163 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:16.767085 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67ae350-e7a9-47f3-98bd-f42432d98be2" path="/var/lib/kubelet/pods/f67ae350-e7a9-47f3-98bd-f42432d98be2/volumes"
Apr 23 01:18:26.447827 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.447795 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"
Apr 23 01:18:26.448267 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.447852 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"
Apr 23 01:18:26.513094 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.513059 2569
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"] Apr 23 01:18:26.513338 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.513297 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" podUID="94998a26-8514-4fd9-8198-900153d547dd" containerName="manager" containerID="cri-o://1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65" gracePeriod=10 Apr 23 01:18:26.748033 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.748007 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"] Apr 23 01:18:26.751866 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.751845 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:26.753173 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.753146 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" Apr 23 01:18:26.769297 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.769273 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"] Apr 23 01:18:26.802160 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.802133 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cjjn\" (UniqueName: \"kubernetes.io/projected/94998a26-8514-4fd9-8198-900153d547dd-kube-api-access-8cjjn\") pod \"94998a26-8514-4fd9-8198-900153d547dd\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " Apr 23 01:18:26.802307 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.802237 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/94998a26-8514-4fd9-8198-900153d547dd-extensions-socket-volume\") pod \"94998a26-8514-4fd9-8198-900153d547dd\" (UID: \"94998a26-8514-4fd9-8198-900153d547dd\") " Apr 23 01:18:26.802414 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.802394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nzzbg\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:26.802629 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.802587 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94998a26-8514-4fd9-8198-900153d547dd-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "94998a26-8514-4fd9-8198-900153d547dd" (UID: "94998a26-8514-4fd9-8198-900153d547dd"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:18:26.802689 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.802591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59c4b\" (UniqueName: \"kubernetes.io/projected/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-kube-api-access-59c4b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nzzbg\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:26.802726 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.802698 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/94998a26-8514-4fd9-8198-900153d547dd-extensions-socket-volume\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:18:26.804261 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.804234 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94998a26-8514-4fd9-8198-900153d547dd-kube-api-access-8cjjn" (OuterVolumeSpecName: "kube-api-access-8cjjn") pod "94998a26-8514-4fd9-8198-900153d547dd" (UID: "94998a26-8514-4fd9-8198-900153d547dd"). InnerVolumeSpecName "kube-api-access-8cjjn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:18:26.903795 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.903758 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59c4b\" (UniqueName: \"kubernetes.io/projected/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-kube-api-access-59c4b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nzzbg\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:26.904017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.903819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nzzbg\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:26.904017 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.903869 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8cjjn\" (UniqueName: \"kubernetes.io/projected/94998a26-8514-4fd9-8198-900153d547dd-kube-api-access-8cjjn\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:18:26.904175 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.904156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nzzbg\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:26.921951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:26.921916 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59c4b\" (UniqueName: \"kubernetes.io/projected/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-kube-api-access-59c4b\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nzzbg\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:27.061233 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.061149 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:27.186953 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.186929 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"] Apr 23 01:18:27.189335 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:18:27.189304 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aaffec6_1473_4ae5_9e1c_7241b8592f1d.slice/crio-a7d885cc5adc5da36c3d2e8de66e18457a3cf255dbe8e2b5f23577918e27a982 WatchSource:0}: Error finding container a7d885cc5adc5da36c3d2e8de66e18457a3cf255dbe8e2b5f23577918e27a982: Status 404 returned error can't find the container with id a7d885cc5adc5da36c3d2e8de66e18457a3cf255dbe8e2b5f23577918e27a982 Apr 23 01:18:27.486480 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.486445 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" event={"ID":"4aaffec6-1473-4ae5-9e1c-7241b8592f1d","Type":"ContainerStarted","Data":"0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba"} Apr 23 01:18:27.486480 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.486483 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" event={"ID":"4aaffec6-1473-4ae5-9e1c-7241b8592f1d","Type":"ContainerStarted","Data":"a7d885cc5adc5da36c3d2e8de66e18457a3cf255dbe8e2b5f23577918e27a982"} Apr 23 01:18:27.487004 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.486498 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:27.487635 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.487593 2569 generic.go:358] "Generic (PLEG): container finished" podID="94998a26-8514-4fd9-8198-900153d547dd" containerID="1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65" exitCode=0 Apr 23 01:18:27.487684 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.487653 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" event={"ID":"94998a26-8514-4fd9-8198-900153d547dd","Type":"ContainerDied","Data":"1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65"} Apr 23 01:18:27.487684 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.487661 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" Apr 23 01:18:27.487684 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.487676 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf" event={"ID":"94998a26-8514-4fd9-8198-900153d547dd","Type":"ContainerDied","Data":"1d5feacc2ef459917f86273fa794ee2fee32c04f1ed57ee44da9ab20a6401671"} Apr 23 01:18:27.487824 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.487691 2569 scope.go:117] "RemoveContainer" containerID="1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65" Apr 23 01:18:27.496632 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.496597 2569 scope.go:117] "RemoveContainer" containerID="1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65" Apr 23 01:18:27.496898 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:18:27.496882 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65\": container with ID starting with 1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65 not found: ID does not exist" containerID="1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65" Apr 23 01:18:27.496940 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.496908 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65"} err="failed to get container status \"1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65\": rpc error: code = NotFound desc = could not find container \"1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65\": container with ID starting with 1bd8998ed5a6709ee336d45dd7d8edf74ba7048c7ab4f1da46f46a641da8af65 not found: ID does not exist" Apr 23 01:18:27.510723 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.510671 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" podStartSLOduration=1.510656312 podStartE2EDuration="1.510656312s" podCreationTimestamp="2026-04-23 01:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:18:27.509759236 +0000 UTC m=+515.295433593" watchObservedRunningTime="2026-04-23 01:18:27.510656312 +0000 UTC m=+515.296330668" Apr 23 01:18:27.524949 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.524920 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"] Apr 23 01:18:27.528965 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:27.528942 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9l8hf"] Apr 23 01:18:28.766691 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:28.766656 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94998a26-8514-4fd9-8198-900153d547dd" path="/var/lib/kubelet/pods/94998a26-8514-4fd9-8198-900153d547dd/volumes" Apr 23 01:18:38.494905 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:38.494876 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" Apr 23 01:18:38.541084 ip-10-0-138-235 
kubenswrapper[2569]: I0423 01:18:38.541048 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"] Apr 23 01:18:38.541361 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:38.541333 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" podUID="2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" containerName="manager" containerID="cri-o://4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6" gracePeriod=10 Apr 23 01:18:38.775466 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:38.775438 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" Apr 23 01:18:38.923692 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:38.923662 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nst87\" (UniqueName: \"kubernetes.io/projected/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-kube-api-access-nst87\") pod \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " Apr 23 01:18:38.923875 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:38.923706 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-extensions-socket-volume\") pod \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\" (UID: \"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e\") " Apr 23 01:18:38.924188 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:38.924156 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" (UID: "2fdaa77e-9a45-4627-aa2b-f8412f42ba7e"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 01:18:38.925565 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:38.925541 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-kube-api-access-nst87" (OuterVolumeSpecName: "kube-api-access-nst87") pod "2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" (UID: "2fdaa77e-9a45-4627-aa2b-f8412f42ba7e"). InnerVolumeSpecName "kube-api-access-nst87". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 01:18:39.024731 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.024646 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nst87\" (UniqueName: \"kubernetes.io/projected/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-kube-api-access-nst87\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:18:39.024731 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.024678 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e-extensions-socket-volume\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\"" Apr 23 01:18:39.531571 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.531538 2569 generic.go:358] "Generic (PLEG): container finished" podID="2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" containerID="4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6" exitCode=0 Apr 23 01:18:39.532028 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.531598 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" Apr 23 01:18:39.532028 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.531645 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" event={"ID":"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e","Type":"ContainerDied","Data":"4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6"} Apr 23 01:18:39.532028 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.531692 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999" event={"ID":"2fdaa77e-9a45-4627-aa2b-f8412f42ba7e","Type":"ContainerDied","Data":"75761be218c3c5441f61d5255b740a06a2a8deecd4654a79367c3e74c57ef4c6"} Apr 23 01:18:39.532028 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.531718 2569 scope.go:117] "RemoveContainer" containerID="4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6" Apr 23 01:18:39.540083 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.540068 2569 scope.go:117] "RemoveContainer" containerID="4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6" Apr 23 01:18:39.540328 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:18:39.540306 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6\": container with ID starting with 4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6 not found: ID does not exist" containerID="4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6" Apr 23 01:18:39.540422 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.540332 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6"} err="failed to get container status \"4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6\": rpc error: code = NotFound desc = could not find container \"4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6\": container with ID starting with 4882fc9ba0f6fce7211fb181e2438de0bdd057efdb7cdbd22114ff7992c483b6 not found: ID does not exist" Apr 23 01:18:39.553159 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.553135 2569 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"] Apr 23 01:18:39.556889 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:39.556869 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-w7999"] Apr 23 01:18:40.766429 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:40.766393 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" path="/var/lib/kubelet/pods/2fdaa77e-9a45-4627-aa2b-f8412f42ba7e/volumes" Apr 23 01:18:42.784981 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.784872 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr"] Apr 23 01:18:42.785922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.785437 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94998a26-8514-4fd9-8198-900153d547dd" containerName="manager" Apr 23 01:18:42.785922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.785454 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="94998a26-8514-4fd9-8198-900153d547dd" containerName="manager" Apr 23 01:18:42.785922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.785484 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" containerName="manager" Apr 23 01:18:42.785922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.785491 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" containerName="manager" Apr 23 01:18:42.785922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.785574 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="94998a26-8514-4fd9-8198-900153d547dd" containerName="manager" Apr 23 01:18:42.785922 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.785586 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fdaa77e-9a45-4627-aa2b-f8412f42ba7e" containerName="manager" Apr 23 01:18:42.788518 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.788494 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.790537 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.790516 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-4k4xv\"" Apr 23 01:18:42.800605 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.800584 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr"] Apr 23 01:18:42.960579 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a5b1db40-a787-4d97-87ff-5443591b723f-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.960579 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960590 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s628r\" (UniqueName: \"kubernetes.io/projected/a5b1db40-a787-4d97-87ff-5443591b723f-kube-api-access-s628r\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.960842 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.960842 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960746 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.960842 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960779 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.960951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960860 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.960951 ip-10-0-138-235 
kubenswrapper[2569]: I0423 01:18:42.960875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a5b1db40-a787-4d97-87ff-5443591b723f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.960951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960894 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:42.961060 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:42.960937 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a5b1db40-a787-4d97-87ff-5443591b723f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062176 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062176 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062399 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062399 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a5b1db40-a787-4d97-87ff-5443591b723f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062399 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062545 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a5b1db40-a787-4d97-87ff-5443591b723f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062545 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a5b1db40-a787-4d97-87ff-5443591b723f-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s628r\" (UniqueName: \"kubernetes.io/projected/a5b1db40-a787-4d97-87ff-5443591b723f-kube-api-access-s628r\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062690 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062831 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.062875 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 
01:18:43.062958 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.062937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.063046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.063023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a5b1db40-a787-4d97-87ff-5443591b723f-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.064530 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.064508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a5b1db40-a787-4d97-87ff-5443591b723f-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.064926 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.064907 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a5b1db40-a787-4d97-87ff-5443591b723f-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.069903 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.069885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s628r\" (UniqueName: \"kubernetes.io/projected/a5b1db40-a787-4d97-87ff-5443591b723f-kube-api-access-s628r\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.069989 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.069948 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a5b1db40-a787-4d97-87ff-5443591b723f-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-2qnsr\" (UID: \"a5b1db40-a787-4d97-87ff-5443591b723f\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.102582 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.102560 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:43.430876 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.430822 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr"] Apr 23 01:18:43.433538 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:18:43.433505 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5b1db40_a787_4d97_87ff_5443591b723f.slice/crio-a8d69a71cfdf783440edafa43ab376c601932f3ad39aaf2956814d2f130c9dc8 WatchSource:0}: Error finding container a8d69a71cfdf783440edafa43ab376c601932f3ad39aaf2956814d2f130c9dc8: Status 404 returned error can't find the container with id a8d69a71cfdf783440edafa43ab376c601932f3ad39aaf2956814d2f130c9dc8 Apr 23 01:18:43.435632 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.435582 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 01:18:43.435711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.435669 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 01:18:43.435711 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.435698 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 01:18:43.550122 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.550090 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" event={"ID":"a5b1db40-a787-4d97-87ff-5443591b723f","Type":"ContainerStarted","Data":"89e4388e5638ecb3ae9781ac9801efce396df5afec6dbebd8a32d2bb3d3dd407"} Apr 23 01:18:43.550122 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.550128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" event={"ID":"a5b1db40-a787-4d97-87ff-5443591b723f","Type":"ContainerStarted","Data":"a8d69a71cfdf783440edafa43ab376c601932f3ad39aaf2956814d2f130c9dc8"} Apr 23 01:18:43.569568 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:43.569508 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" podStartSLOduration=1.569487589 podStartE2EDuration="1.569487589s" podCreationTimestamp="2026-04-23 01:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:18:43.566339746 +0000 UTC m=+531.352014128" watchObservedRunningTime="2026-04-23 01:18:43.569487589 +0000 UTC m=+531.355161947" Apr 23 01:18:44.102913 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:44.102881 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:44.104488 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:44.104462 2569 patch_prober.go:28] interesting pod/maas-default-gateway-openshift-default-845c6b4b48-2qnsr container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get 
\"http://10.132.0.39:15021/healthz/ready\": dial tcp 10.132.0.39:15021: connect: connection refused" start-of-body= Apr 23 01:18:44.104625 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:44.104513 2569 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" podUID="a5b1db40-a787-4d97-87ff-5443591b723f" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.39:15021/healthz/ready\": dial tcp 10.132.0.39:15021: connect: connection refused" Apr 23 01:18:45.108096 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:45.108069 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:45.558251 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:45.558218 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:45.559302 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:45.559284 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-2qnsr" Apr 23 01:18:56.625310 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.625266 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"] Apr 23 01:18:56.630533 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.630508 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:56.632953 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.632924 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 23 01:18:56.633526 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.633503 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4dlc4\"" Apr 23 01:18:56.635702 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.635683 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"] Apr 23 01:18:56.650703 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.650674 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"] Apr 23 01:18:56.688647 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.688597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5e25ee0-d154-4b93-ac77-53abdcdc47be-config-file\") pod \"limitador-limitador-78c99df468-7mwh2\" (UID: \"d5e25ee0-d154-4b93-ac77-53abdcdc47be\") " pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:56.688813 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.688753 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zct7h\" (UniqueName: \"kubernetes.io/projected/d5e25ee0-d154-4b93-ac77-53abdcdc47be-kube-api-access-zct7h\") pod \"limitador-limitador-78c99df468-7mwh2\" (UID: \"d5e25ee0-d154-4b93-ac77-53abdcdc47be\") " pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:56.789949 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.789914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zct7h\" (UniqueName: \"kubernetes.io/projected/d5e25ee0-d154-4b93-ac77-53abdcdc47be-kube-api-access-zct7h\") pod \"limitador-limitador-78c99df468-7mwh2\" (UID: \"d5e25ee0-d154-4b93-ac77-53abdcdc47be\") " pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:56.790148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.789966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5e25ee0-d154-4b93-ac77-53abdcdc47be-config-file\") pod \"limitador-limitador-78c99df468-7mwh2\" (UID: \"d5e25ee0-d154-4b93-ac77-53abdcdc47be\") " pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:56.790572 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.790549 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5e25ee0-d154-4b93-ac77-53abdcdc47be-config-file\") pod \"limitador-limitador-78c99df468-7mwh2\" (UID: \"d5e25ee0-d154-4b93-ac77-53abdcdc47be\") " pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:56.797302 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.797279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zct7h\" (UniqueName: \"kubernetes.io/projected/d5e25ee0-d154-4b93-ac77-53abdcdc47be-kube-api-access-zct7h\") pod \"limitador-limitador-78c99df468-7mwh2\" (UID: \"d5e25ee0-d154-4b93-ac77-53abdcdc47be\") " pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:56.943654 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:56.943592 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:18:57.090549 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:57.090514 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"] Apr 23 01:18:57.093770 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:18:57.093731 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5e25ee0_d154_4b93_ac77_53abdcdc47be.slice/crio-a22f90e76fefd2ab618742ed74aaf4c07b624a9bdc9109ee2d37c56f87fe7810 WatchSource:0}: Error finding container a22f90e76fefd2ab618742ed74aaf4c07b624a9bdc9109ee2d37c56f87fe7810: Status 404 returned error can't find the container with id a22f90e76fefd2ab618742ed74aaf4c07b624a9bdc9109ee2d37c56f87fe7810 Apr 23 01:18:57.606794 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:18:57.606739 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" event={"ID":"d5e25ee0-d154-4b93-ac77-53abdcdc47be","Type":"ContainerStarted","Data":"a22f90e76fefd2ab618742ed74aaf4c07b624a9bdc9109ee2d37c56f87fe7810"} Apr 23 01:19:00.622334 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:00.622292 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" event={"ID":"d5e25ee0-d154-4b93-ac77-53abdcdc47be","Type":"ContainerStarted","Data":"e9700f440df87f82875e0173f9b5f0b21ef94005f87234324cb5332fa3f156d7"} Apr 23 01:19:00.622780 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:00.622405 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" Apr 23 01:19:00.639333 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:00.639288 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2" podStartSLOduration=2.178375495 podStartE2EDuration="4.639276015s" podCreationTimestamp="2026-04-23 01:18:56 +0000 UTC" firstStartedPulling="2026-04-23 01:18:57.095806959 +0000 UTC m=+544.881481293" lastFinishedPulling="2026-04-23 01:18:59.556707475 +0000 UTC m=+547.342381813" observedRunningTime="2026-04-23 01:19:00.636980003 +0000 UTC m=+548.422654360" watchObservedRunningTime="2026-04-23 01:19:00.639276015 +0000 UTC m=+548.424950370"
Apr 23 01:19:11.627148 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:11.627116 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-7mwh2"
Apr 23 01:19:50.537941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:50.537901 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:19:52.682919 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:52.682884 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:19:52.685674 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:52.685651 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:19:52.690423 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:52.690400 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:19:52.691991 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:19:52.691963 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:20:23.614752 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:20:23.614718 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:20:31.211946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:20:31.211910 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:20:33.907055 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:20:33.907016 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:20:43.244016 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:20:43.243983 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:20:52.414005 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:20:52.413971 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:21:06.909005 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:21:06.908966 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:22:15.311880 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:22:15.311807 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:22:24.711527 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:22:24.711488 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:22:33.511046 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:22:33.511011 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:22:44.316845 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:22:44.316810 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:22:53.414825 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:22:53.414787 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:23:04.612658 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:23:04.612620 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:24:05.707323 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:24:05.707283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:24:20.512769 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:24:20.512730 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:24:52.711019 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:24:52.710940 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:24:52.714648 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:24:52.714604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:24:52.715929 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:24:52.715909 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:24:52.719425 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:24:52.719409 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:24:59.217310 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:24:59.217276 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:25:15.609539 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:25:15.609498 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:25:30.309479 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:25:30.309438 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:25:46.712967 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:25:46.712926 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:26:37.811290 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:26:37.811254 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:26:46.414552 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:26:46.414517 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:27:03.205149 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:27:03.205111 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:27:11.513247 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:27:11.513208 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:27:28.813638 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:27:28.811643 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:27:37.117787 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:27:37.117749 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:28:10.313467 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:28:10.313388 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:28:18.111190 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:28:18.111153 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:28:26.609039 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:28:26.609005 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:28:35.205809 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:28:35.205777 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:28:44.104500 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:28:44.104465 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:28:59.804207 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:28:59.804165 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:29:11.208713 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:29:11.208672 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:29:52.736687 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:29:52.736653 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:29:52.741405 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:29:52.741374 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:29:52.741715 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:29:52.741693 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:29:52.746410 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:29:52.746393 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:29:57.713736 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:29:57.713695 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:00.138544 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.138510 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29615130-xkq4d"]
Apr 23 01:30:00.142061 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.142040 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d"
Apr 23 01:30:00.144139 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.144115 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-6vbgm\""
Apr 23 01:30:00.153870 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.153845 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29615130-xkq4d"]
Apr 23 01:30:00.224433 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.224390 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbmft\" (UniqueName: \"kubernetes.io/projected/3f311e89-efb6-4857-bb74-abd5ed9d3663-kube-api-access-gbmft\") pod \"maas-api-key-cleanup-29615130-xkq4d\" (UID: \"3f311e89-efb6-4857-bb74-abd5ed9d3663\") " pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d"
Apr 23 01:30:00.325726 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.325684 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbmft\" (UniqueName: \"kubernetes.io/projected/3f311e89-efb6-4857-bb74-abd5ed9d3663-kube-api-access-gbmft\") pod \"maas-api-key-cleanup-29615130-xkq4d\" (UID: \"3f311e89-efb6-4857-bb74-abd5ed9d3663\") " pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d"
Apr 23 01:30:00.334012 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.333979 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbmft\" (UniqueName: \"kubernetes.io/projected/3f311e89-efb6-4857-bb74-abd5ed9d3663-kube-api-access-gbmft\") pod \"maas-api-key-cleanup-29615130-xkq4d\" (UID: \"3f311e89-efb6-4857-bb74-abd5ed9d3663\") " pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d"
Apr 23 01:30:00.453401 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.453312 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d"
Apr 23 01:30:00.786129 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.786099 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29615130-xkq4d"]
Apr 23 01:30:00.790969 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:30:00.790935 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f311e89_efb6_4857_bb74_abd5ed9d3663.slice/crio-6acd71cc35c29a3b5470deac5c33e5723ebf95926ad6ab0fac70ff9c8214dfbb WatchSource:0}: Error finding container 6acd71cc35c29a3b5470deac5c33e5723ebf95926ad6ab0fac70ff9c8214dfbb: Status 404 returned error can't find the container with id 6acd71cc35c29a3b5470deac5c33e5723ebf95926ad6ab0fac70ff9c8214dfbb
Apr 23 01:30:00.792684 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.792666 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:30:00.933688 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:00.933648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerStarted","Data":"6acd71cc35c29a3b5470deac5c33e5723ebf95926ad6ab0fac70ff9c8214dfbb"}
Apr 23 01:30:03.946435 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:03.946399 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerStarted","Data":"d79f6e56a0474e14c141e634cb101531fa2b778b073ccb694b8bd931bb831d82"}
Apr 23 01:30:03.960207 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:03.960153 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" podStartSLOduration=1.790353823 podStartE2EDuration="3.96013712s" podCreationTimestamp="2026-04-23 01:30:00 +0000 UTC" firstStartedPulling="2026-04-23 01:30:00.792796539 +0000 UTC m=+1208.578470877" lastFinishedPulling="2026-04-23 01:30:02.962579833 +0000 UTC m=+1210.748254174" observedRunningTime="2026-04-23 01:30:03.95885625 +0000 UTC m=+1211.744530608" watchObservedRunningTime="2026-04-23 01:30:03.96013712 +0000 UTC m=+1211.745811477"
Apr 23 01:30:06.310199 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:06.310160 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:15.363492 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:15.363388 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:22.710420 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:22.710380 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:24.028001 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:24.027963 2569 generic.go:358] "Generic (PLEG): container finished" podID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerID="d79f6e56a0474e14c141e634cb101531fa2b778b073ccb694b8bd931bb831d82" exitCode=6
Apr 23 01:30:24.028495 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:24.028044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerDied","Data":"d79f6e56a0474e14c141e634cb101531fa2b778b073ccb694b8bd931bb831d82"}
Apr 23 01:30:24.028495 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:24.028364 2569 scope.go:117] "RemoveContainer" containerID="d79f6e56a0474e14c141e634cb101531fa2b778b073ccb694b8bd931bb831d82"
Apr 23 01:30:25.032916 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:25.032879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerStarted","Data":"fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"}
Apr 23 01:30:33.510000 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:33.509964 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:41.712068 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:41.711835 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:45.121717 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:45.121624 2569 generic.go:358] "Generic (PLEG): container finished" podID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerID="fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29" exitCode=6
Apr 23 01:30:45.121717 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:45.121640 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerDied","Data":"fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"}
Apr 23 01:30:45.121717 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:45.121698 2569 scope.go:117] "RemoveContainer" containerID="d79f6e56a0474e14c141e634cb101531fa2b778b073ccb694b8bd931bb831d82"
Apr 23 01:30:45.122216 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:45.122041 2569 scope.go:117] "RemoveContainer" containerID="fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"
Apr 23 01:30:45.122298 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:30:45.122279 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29615130-xkq4d_opendatahub(3f311e89-efb6-4857-bb74-abd5ed9d3663)\"" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663"
Apr 23 01:30:49.607811 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:49.607768 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:54.705128 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:54.705092 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:58.814948 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:58.814913 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:30:59.763230 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:30:59.763196 2569 scope.go:117] "RemoveContainer" containerID="fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"
Apr 23 01:31:00.010215 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:00.010184 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29615130-xkq4d"]
Apr 23 01:31:00.178538 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:00.178501 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerStarted","Data":"457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033"}
Apr 23 01:31:00.178753 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:00.178664 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup" containerID="cri-o://457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033" gracePeriod=30
Apr 23 01:31:07.014365 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:07.014330 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:31:15.907066 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:15.907029 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:31:20.526788 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:20.526765 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d"
Apr 23 01:31:20.663197 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:20.663149 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbmft\" (UniqueName: \"kubernetes.io/projected/3f311e89-efb6-4857-bb74-abd5ed9d3663-kube-api-access-gbmft\") pod \"3f311e89-efb6-4857-bb74-abd5ed9d3663\" (UID: \"3f311e89-efb6-4857-bb74-abd5ed9d3663\") "
Apr 23 01:31:20.665402 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:20.665374 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f311e89-efb6-4857-bb74-abd5ed9d3663-kube-api-access-gbmft" (OuterVolumeSpecName: "kube-api-access-gbmft") pod "3f311e89-efb6-4857-bb74-abd5ed9d3663" (UID: "3f311e89-efb6-4857-bb74-abd5ed9d3663"). InnerVolumeSpecName "kube-api-access-gbmft". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:31:20.764685 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:20.764641 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbmft\" (UniqueName: \"kubernetes.io/projected/3f311e89-efb6-4857-bb74-abd5ed9d3663-kube-api-access-gbmft\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\""
Apr 23 01:31:21.257508 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.257473 2569 generic.go:358] "Generic (PLEG): container finished" podID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerID="457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033" exitCode=6
Apr 23 01:31:21.257714 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.257541 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d"
Apr 23 01:31:21.257714 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.257560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerDied","Data":"457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033"}
Apr 23 01:31:21.257714 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.257598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29615130-xkq4d" event={"ID":"3f311e89-efb6-4857-bb74-abd5ed9d3663","Type":"ContainerDied","Data":"6acd71cc35c29a3b5470deac5c33e5723ebf95926ad6ab0fac70ff9c8214dfbb"}
Apr 23 01:31:21.257714 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.257627 2569 scope.go:117] "RemoveContainer" containerID="457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033"
Apr 23 01:31:21.266696 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.266675 2569 scope.go:117] "RemoveContainer" containerID="fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"
Apr 23 01:31:21.272675 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.272649 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29615130-xkq4d"]
Apr 23 01:31:21.274372 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.274351 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29615130-xkq4d"]
Apr 23 01:31:21.275213 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.275201 2569 scope.go:117] "RemoveContainer" containerID="457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033"
Apr 23 01:31:21.275493 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:31:21.275471 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033\": container with ID starting with 457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033 not found: ID does not exist" containerID="457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033"
Apr 23 01:31:21.275547 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.275501 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033"} err="failed to get container status \"457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033\": rpc error: code = NotFound desc = could not find container \"457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033\": container with ID starting with 457817cad08d9de75914ee0ad505c829d03e5688eb71d10bebc8ae4512de5033 not found: ID does not exist"
Apr 23 01:31:21.275547 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.275520 2569 scope.go:117] "RemoveContainer" containerID="fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"
Apr 23 01:31:21.275816 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:31:21.275787 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29\": container with ID starting with fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29 not found: ID does not exist" containerID="fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"
Apr 23 01:31:21.275912 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:21.275818 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29"} err="failed to get container status \"fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29\": rpc error: code = NotFound desc = could not find container \"fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29\": container with ID starting with fae902ef78e39348884d9d053a1a9c6fc4cc30ecd5b209b5be9882d794e1ed29 not found: ID does not exist"
Apr 23 01:31:22.767923 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:22.767884 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" path="/var/lib/kubelet/pods/3f311e89-efb6-4857-bb74-abd5ed9d3663/volumes"
Apr 23 01:31:25.209638 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:25.209592 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:31:33.013670 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:33.013633 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:31:41.908400 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:41.908369 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:31:51.616950 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:51.616913 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:31:59.215843 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:31:59.215809 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:32:07.611952 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:32:07.611918 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:32:16.714787 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:32:16.714744 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:32:25.007749 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:32:25.007707 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:33:16.871647 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:16.871554 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"]
Apr 23 01:33:16.872255 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:16.871810 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" podUID="4aaffec6-1473-4ae5-9e1c-7241b8592f1d" containerName="manager" containerID="cri-o://0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba" gracePeriod=10
Apr 23 01:33:17.323068 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.323044 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"
Apr 23 01:33:17.503497 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.503396 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-extensions-socket-volume\") pod \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") "
Apr 23 01:33:17.503497 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.503456 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59c4b\" (UniqueName: \"kubernetes.io/projected/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-kube-api-access-59c4b\") pod \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\" (UID: \"4aaffec6-1473-4ae5-9e1c-7241b8592f1d\") "
Apr 23 01:33:17.503838 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.503809 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "4aaffec6-1473-4ae5-9e1c-7241b8592f1d" (UID: "4aaffec6-1473-4ae5-9e1c-7241b8592f1d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 01:33:17.505564 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.505536 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-kube-api-access-59c4b" (OuterVolumeSpecName: "kube-api-access-59c4b") pod "4aaffec6-1473-4ae5-9e1c-7241b8592f1d" (UID: "4aaffec6-1473-4ae5-9e1c-7241b8592f1d"). InnerVolumeSpecName "kube-api-access-59c4b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 01:33:17.604063 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.604032 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-extensions-socket-volume\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\""
Apr 23 01:33:17.604063 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.604057 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-59c4b\" (UniqueName: \"kubernetes.io/projected/4aaffec6-1473-4ae5-9e1c-7241b8592f1d-kube-api-access-59c4b\") on node \"ip-10-0-138-235.ec2.internal\" DevicePath \"\""
Apr 23 01:33:17.681763 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.681729 2569 generic.go:358] "Generic (PLEG): container finished" podID="4aaffec6-1473-4ae5-9e1c-7241b8592f1d" containerID="0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba" exitCode=0
Apr 23 01:33:17.681924 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.681792 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" event={"ID":"4aaffec6-1473-4ae5-9e1c-7241b8592f1d","Type":"ContainerDied","Data":"0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba"}
Apr 23 01:33:17.681924 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.681799 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"
Apr 23 01:33:17.681924 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.681819 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg" event={"ID":"4aaffec6-1473-4ae5-9e1c-7241b8592f1d","Type":"ContainerDied","Data":"a7d885cc5adc5da36c3d2e8de66e18457a3cf255dbe8e2b5f23577918e27a982"}
Apr 23 01:33:17.681924 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.681835 2569 scope.go:117] "RemoveContainer" containerID="0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba"
Apr 23 01:33:17.691665 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.691644 2569 scope.go:117] "RemoveContainer" containerID="0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba"
Apr 23 01:33:17.691953 ip-10-0-138-235 kubenswrapper[2569]: E0423 01:33:17.691933 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba\": container with ID starting with 0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba not found: ID does not exist" containerID="0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba"
Apr 23 01:33:17.692040 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.691966 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba"} err="failed to get container status \"0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba\": rpc error: code = NotFound desc = could not find container \"0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba\": container with ID starting with 0ec8f00cbccfc146c931fa124358726c147e4f3f1695b63f10f4331ef14c8aba not found: ID does not exist"
Apr 23 01:33:17.702760 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.702733 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"]
Apr 23 01:33:17.706188 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:17.706165 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nzzbg"]
Apr 23 01:33:18.769941 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:33:18.769908 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aaffec6-1473-4ae5-9e1c-7241b8592f1d" path="/var/lib/kubelet/pods/4aaffec6-1473-4ae5-9e1c-7241b8592f1d/volumes"
Apr 23 01:34:22.930854 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.930816 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"]
Apr 23 01:34:22.931439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931339 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4aaffec6-1473-4ae5-9e1c-7241b8592f1d" containerName="manager"
Apr 23 01:34:22.931439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931358 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aaffec6-1473-4ae5-9e1c-7241b8592f1d" containerName="manager"
Apr 23 01:34:22.931439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931380 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:34:22.931439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931390 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:34:22.931439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931417 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:34:22.931439 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931425 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:34:22.931786 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931515 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4aaffec6-1473-4ae5-9e1c-7241b8592f1d" containerName="manager"
Apr 23 01:34:22.931786 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931530 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:34:22.931786 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931543 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:34:22.931786 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.931554 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:34:22.934572 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.934551 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:22.937122 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.937103 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-94w6b\""
Apr 23 01:34:22.945251 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:22.945230 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"]
Apr 23 01:34:23.001930 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.001898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpql9\" (UniqueName: \"kubernetes.io/projected/7937fc2f-481d-4b74-9469-519188bc0350-kube-api-access-wpql9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kgsn4\" (UID: \"7937fc2f-481d-4b74-9469-519188bc0350\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.001930 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.001934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7937fc2f-481d-4b74-9469-519188bc0350-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kgsn4\" (UID: \"7937fc2f-481d-4b74-9469-519188bc0350\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.102756 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.102720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpql9\" (UniqueName: \"kubernetes.io/projected/7937fc2f-481d-4b74-9469-519188bc0350-kube-api-access-wpql9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kgsn4\" (UID: \"7937fc2f-481d-4b74-9469-519188bc0350\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.102957 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.102773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7937fc2f-481d-4b74-9469-519188bc0350-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kgsn4\" (UID: \"7937fc2f-481d-4b74-9469-519188bc0350\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.103192 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.103170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7937fc2f-481d-4b74-9469-519188bc0350-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kgsn4\" (UID: \"7937fc2f-481d-4b74-9469-519188bc0350\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.110658 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.110603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpql9\" (UniqueName: \"kubernetes.io/projected/7937fc2f-481d-4b74-9469-519188bc0350-kube-api-access-wpql9\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kgsn4\" (UID: \"7937fc2f-481d-4b74-9469-519188bc0350\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.245517 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.245441 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.375722 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.375690 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"]
Apr 23 01:34:23.378119 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:34:23.378091 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7937fc2f_481d_4b74_9469_519188bc0350.slice/crio-ba850b8720387c53b99d17dfeb2c8f286ab744b001d2bd065d2958d927f56de6 WatchSource:0}: Error finding container ba850b8720387c53b99d17dfeb2c8f286ab744b001d2bd065d2958d927f56de6: Status 404 returned error can't find the container with id ba850b8720387c53b99d17dfeb2c8f286ab744b001d2bd065d2958d927f56de6
Apr 23 01:34:23.917779 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.917738 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4" event={"ID":"7937fc2f-481d-4b74-9469-519188bc0350","Type":"ContainerStarted","Data":"3e4e8627efcfcdee98dc825d26e1946f0322686a44a9528a962f388f6523f62d"}
Apr 23 01:34:23.917779 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.917779 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:23.917779 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.917789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4" event={"ID":"7937fc2f-481d-4b74-9469-519188bc0350","Type":"ContainerStarted","Data":"ba850b8720387c53b99d17dfeb2c8f286ab744b001d2bd065d2958d927f56de6"}
Apr 23 01:34:23.939692 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:23.938629 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4" podStartSLOduration=1.9385890190000001 podStartE2EDuration="1.938589019s" podCreationTimestamp="2026-04-23 01:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 01:34:23.935050644 +0000 UTC m=+1471.720725001" watchObservedRunningTime="2026-04-23 01:34:23.938589019 +0000 UTC m=+1471.724263376"
Apr 23 01:34:34.923556 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:34.923523 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kgsn4"
Apr 23 01:34:44.113945 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:44.113910 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:34:50.915039 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:50.915002 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:34:52.770339 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:52.770309 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:34:52.774624 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:52.774592 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:34:52.776082 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:52.776057 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:34:52.780747 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:34:52.780729 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:35:14.511161 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:35:14.511080 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:35:21.206972 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:35:21.206934 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:35:28.908540 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:35:28.908506 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:35:40.615816 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:35:40.615781 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:35:48.715397 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:35:48.715357 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:35:59.318093 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:35:59.318059 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:36:08.417700 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:36:08.417662 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:36:19.606119 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:36:19.606082 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:36:27.408942 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:36:27.408899 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:36:38.224292 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:36:38.224257 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:36:47.709724 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:36:47.709642 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:37:21.312811 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:37:21.312770 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:38:04.915619 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:38:04.915586 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:38:12.916859 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:38:12.916771 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:38:22.022677 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:38:22.022636 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:38:30.623022 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:38:30.622987 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:38:38.922560 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:38:38.922524 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:38:51.933234 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:38:51.933200 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:39:01.215076 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:01.215039 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:39:07.319687 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:07.319652 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:39:17.583744 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:17.583704 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:39:25.645698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:25.645660 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:39:33.342106 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:33.342027 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:39:44.353948 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:44.353870 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:39:52.795540 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:52.795513 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:39:52.800188 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:52.800163 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:39:52.802218 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:52.802197 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log"
Apr 23 01:39:52.806734 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:39:52.806717 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log"
Apr 23 01:40:01.342694 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:40:01.342654 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:40:09.825698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:40:09.825657 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:40:18.721463 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:40:18.721426 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:40:26.713114 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:40:26.713079 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:40:44.316779 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:40:44.316747 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:40:52.220975 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:40:52.220938 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:41:01.319634 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:41:01.319574 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:41:09.513081 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:41:09.513046 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:41:19.217420 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:41:19.217337 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:41:27.717025 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:41:27.716988 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:41:36.212413 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:41:36.212376 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:41:50.010887 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:41:50.010854 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:41:58.019441 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:41:58.019408 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:42:09.306851 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:42:09.306816 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:42:18.327002 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:42:18.326961 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:42:28.208127 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:42:28.208088 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:42:37.411338 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:42:37.411305 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:42:45.912679 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:42:45.912584 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:43:01.312089 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:43:01.312055 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:43:09.914169 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:43:09.914128 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:43:19.218053 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:43:19.218017 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:43:27.113414 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:43:27.113376 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:43:50.017880 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:43:50.017840 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:44:02.811084 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:02.811052 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7mwh2"]
Apr 23 01:44:09.791417 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:09.791387 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5fb5768b86-hvntf_45c1a1fa-47fe-4d79-813d-abe2e5c22d31/manager/0.log"
Apr 23 01:44:11.335641 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:11.335591 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-hktjf_6aa9b334-0986-4219-8395-33b3949fd6c2/manager/0.log"
Apr 23 01:44:11.441141 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:11.441117 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-lktzw_b743d95c-4e3b-49d3-867e-0354895fee99/manager/0.log"
Apr 23 01:44:11.661508 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:11.661470 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-5qqtk_ca2f3745-de45-4910-b99a-354a7ed67843/registry-server/0.log"
Apr 23 01:44:11.786054 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:11.786018 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-kgsn4_7937fc2f-481d-4b74-9469-519188bc0350/manager/0.log"
Apr 23 01:44:11.896229 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:11.896203 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-7mwh2_d5e25ee0-d154-4b93-ac77-53abdcdc47be/limitador/0.log"
Apr 23 01:44:12.339221 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:12.339189 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f4qzln_814b8694-9550-460f-a692-dce74660f64d/istio-proxy/0.log"
Apr 23 01:44:12.781744 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:12.781715 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-2qnsr_a5b1db40-a787-4d97-87ff-5443591b723f/istio-proxy/0.log"
Apr 23 01:44:17.508383 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.508350 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6hxp5/must-gather-l2n6c"]
Apr 23 01:44:17.508879 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.508863 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:44:17.508951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.508882 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f311e89-efb6-4857-bb74-abd5ed9d3663" containerName="cleanup"
Apr 23 01:44:17.512269 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.512250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.514493 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.514466 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6hxp5\"/\"kube-root-ca.crt\""
Apr 23 01:44:17.514493 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.514483 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6hxp5\"/\"default-dockercfg-rb8bq\""
Apr 23 01:44:17.514699 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.514478 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6hxp5\"/\"openshift-service-ca.crt\""
Apr 23 01:44:17.526994 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.526971 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/must-gather-l2n6c"]
Apr 23 01:44:17.642549 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.642514 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/918ba1d1-dad7-4302-a5c7-33bf75720a84-must-gather-output\") pod \"must-gather-l2n6c\" (UID: \"918ba1d1-dad7-4302-a5c7-33bf75720a84\") " pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.642742 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.642579 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79v72\" (UniqueName: \"kubernetes.io/projected/918ba1d1-dad7-4302-a5c7-33bf75720a84-kube-api-access-79v72\") pod \"must-gather-l2n6c\" (UID: \"918ba1d1-dad7-4302-a5c7-33bf75720a84\") " pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.743746 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.743717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/918ba1d1-dad7-4302-a5c7-33bf75720a84-must-gather-output\") pod \"must-gather-l2n6c\" (UID: \"918ba1d1-dad7-4302-a5c7-33bf75720a84\") " pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.743897 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.743759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79v72\" (UniqueName: \"kubernetes.io/projected/918ba1d1-dad7-4302-a5c7-33bf75720a84-kube-api-access-79v72\") pod \"must-gather-l2n6c\" (UID: \"918ba1d1-dad7-4302-a5c7-33bf75720a84\") " pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.744060 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.744038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/918ba1d1-dad7-4302-a5c7-33bf75720a84-must-gather-output\") pod \"must-gather-l2n6c\" (UID: \"918ba1d1-dad7-4302-a5c7-33bf75720a84\") " pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.750995 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.750968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79v72\" (UniqueName: \"kubernetes.io/projected/918ba1d1-dad7-4302-a5c7-33bf75720a84-kube-api-access-79v72\") pod \"must-gather-l2n6c\" (UID: \"918ba1d1-dad7-4302-a5c7-33bf75720a84\") " pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.822257 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.822187 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/must-gather-l2n6c"
Apr 23 01:44:17.946678 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.946649 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/must-gather-l2n6c"]
Apr 23 01:44:17.949578 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:44:17.949545 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod918ba1d1_dad7_4302_a5c7_33bf75720a84.slice/crio-45034cd35f67ff288b5eebf78dd62cec4ae93b2241d79f32f1e312eaff5dc3c9 WatchSource:0}: Error finding container 45034cd35f67ff288b5eebf78dd62cec4ae93b2241d79f32f1e312eaff5dc3c9: Status 404 returned error can't find the container with id 45034cd35f67ff288b5eebf78dd62cec4ae93b2241d79f32f1e312eaff5dc3c9
Apr 23 01:44:17.951404 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:17.951389 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 01:44:18.059700 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:18.059645 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/must-gather-l2n6c" event={"ID":"918ba1d1-dad7-4302-a5c7-33bf75720a84","Type":"ContainerStarted","Data":"45034cd35f67ff288b5eebf78dd62cec4ae93b2241d79f32f1e312eaff5dc3c9"}
Apr 23 01:44:19.067285 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:19.067217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/must-gather-l2n6c" event={"ID":"918ba1d1-dad7-4302-a5c7-33bf75720a84","Type":"ContainerStarted","Data":"cf8aedcaea727b33c0a4f18e3154564096ae57ff1c12a37b640a28bf39bbf264"}
Apr 23 01:44:20.074824 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:20.074785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/must-gather-l2n6c" event={"ID":"918ba1d1-dad7-4302-a5c7-33bf75720a84","Type":"ContainerStarted","Data":"fd2a12f64b2211d2f10ef3a0161111fb56b31737534ab422ac31d0c562c9332c"}
Apr 23 01:44:20.089043 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:20.089001 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6hxp5/must-gather-l2n6c" podStartSLOduration=2.163089745 podStartE2EDuration="3.088989307s" podCreationTimestamp="2026-04-23 01:44:17 +0000 UTC" firstStartedPulling="2026-04-23 01:44:17.951513988 +0000 UTC m=+2065.737188323" lastFinishedPulling="2026-04-23 01:44:18.87741355 +0000 UTC m=+2066.663087885" observedRunningTime="2026-04-23 01:44:20.088109827 +0000 UTC m=+2067.873784186" watchObservedRunningTime="2026-04-23 01:44:20.088989307 +0000 UTC m=+2067.874663664"
Apr 23 01:44:20.458129 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:20.458081 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6jtxr_ae1a6350-32fe-4569-a4d1-9c369aaff8e4/global-pull-secret-syncer/0.log"
Apr 23 01:44:20.591192 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:20.591160 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6wfbb_643f99c6-2212-475f-8fa2-d1ea3fa8a17a/konnectivity-agent/0.log"
Apr 23 01:44:20.724404 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:20.724291 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-235.ec2.internal_b29f92d08abcb80be4466bff586fc859/haproxy/0.log"
Apr 23 01:44:24.861551 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:24.861514 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-hktjf_6aa9b334-0986-4219-8395-33b3949fd6c2/manager/0.log"
Apr 23 01:44:24.911135 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:24.911107 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-lktzw_b743d95c-4e3b-49d3-867e-0354895fee99/manager/0.log"
Apr 23 01:44:24.981281 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:24.981108 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-5qqtk_ca2f3745-de45-4910-b99a-354a7ed67843/registry-server/0.log"
Apr 23 01:44:25.058055 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:25.058023 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-kgsn4_7937fc2f-481d-4b74-9469-519188bc0350/manager/0.log"
Apr 23 01:44:25.080029 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:25.080000 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-7mwh2_d5e25ee0-d154-4b93-ac77-53abdcdc47be/limitador/0.log"
Apr 23 01:44:26.736458 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:26.736427 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-qgtjh_287bea1a-68cd-49c1-a36a-2fc24dbc7719/cluster-monitoring-operator/0.log"
Apr 23 01:44:26.889879 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:26.889853 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kdwf9_4a3638c0-7a9c-4827-92b6-19af4e48804e/node-exporter/0.log"
Apr 23 01:44:26.913361 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:26.913328 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kdwf9_4a3638c0-7a9c-4827-92b6-19af4e48804e/kube-rbac-proxy/0.log"
Apr 23 01:44:26.938024 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:26.937996 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kdwf9_4a3638c0-7a9c-4827-92b6-19af4e48804e/init-textfile/0.log"
Apr 23 01:44:27.213884 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.213796 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_35985415-770a-4cf2-a83a-2a1bbef2d634/prometheus/0.log"
Apr 23 01:44:27.233844 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.233810 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_35985415-770a-4cf2-a83a-2a1bbef2d634/config-reloader/0.log"
Apr 23 01:44:27.260680 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.260651 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_35985415-770a-4cf2-a83a-2a1bbef2d634/thanos-sidecar/0.log"
Apr 23 01:44:27.286660 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.286635 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_35985415-770a-4cf2-a83a-2a1bbef2d634/kube-rbac-proxy-web/0.log"
Apr 23 01:44:27.309219 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.309179 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_35985415-770a-4cf2-a83a-2a1bbef2d634/kube-rbac-proxy/0.log"
Apr 23 01:44:27.329986 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.329958 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_35985415-770a-4cf2-a83a-2a1bbef2d634/kube-rbac-proxy-thanos/0.log"
Apr 23 01:44:27.350927 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.350895 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_35985415-770a-4cf2-a83a-2a1bbef2d634/init-config-reloader/0.log"
Apr 23 01:44:27.380185 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.380156 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-b4wg8_f3158a19-531a-473b-9d1f-1b765e094c1b/prometheus-operator/0.log"
Apr 23 01:44:27.399637 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.399581 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-b4wg8_f3158a19-531a-473b-9d1f-1b765e094c1b/kube-rbac-proxy/0.log"
Apr 23 01:44:27.423455 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.423427 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-j8mwv_fb37d61f-d23a-4a04-a8cc-a5b2f18a0faf/prometheus-operator-admission-webhook/0.log"
Apr 23 01:44:27.452280 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.452241 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-549565c777-rzlzc_7a8c586f-d7a8-4d9d-9816-42be04262414/telemeter-client/0.log"
Apr 23 01:44:27.474417 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.474348 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-549565c777-rzlzc_7a8c586f-d7a8-4d9d-9816-42be04262414/reload/0.log"
Apr 23 01:44:27.509993 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:27.509928 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-549565c777-rzlzc_7a8c586f-d7a8-4d9d-9816-42be04262414/kube-rbac-proxy/0.log"
Apr 23 01:44:28.767070 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:28.767038 2569 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-vttwn_0fbe8b71-af85-4c24-a839-1dc68b57173b/networking-console-plugin/0.log" Apr 23 01:44:29.298732 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.298698 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/1.log" Apr 23 01:44:29.305581 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.305550 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-grc82_58715151-e1c9-475c-be78-487774704c95/console-operator/2.log" Apr 23 01:44:29.371312 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.371274 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m"] Apr 23 01:44:29.378100 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.378080 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.383211 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.383187 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m"] Apr 23 01:44:29.470269 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.470232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-podres\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.470590 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.470570 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-proc\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.470816 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.470799 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-sys\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.470983 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.470949 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq898\" (UniqueName: \"kubernetes.io/projected/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-kube-api-access-cq898\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.471106 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.471095 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-lib-modules\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " 
pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.571946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.571856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-sys\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.571946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.571910 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq898\" (UniqueName: \"kubernetes.io/projected/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-kube-api-access-cq898\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.571946 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.571943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-lib-modules\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.572235 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.571988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-podres\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.572235 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.572055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-proc\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.572235 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.572169 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-proc\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.572235 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.572172 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-lib-modules\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.572385 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.572284 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-podres\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.572385 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.572324 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-sys\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.581283 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.581244 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq898\" (UniqueName: \"kubernetes.io/projected/5dc146f5-6526-4d84-91f2-3cc69fad5c9b-kube-api-access-cq898\") pod \"perf-node-gather-daemonset-7q64m\" (UID: \"5dc146f5-6526-4d84-91f2-3cc69fad5c9b\") " pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.692166 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.692128 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:29.825112 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.825019 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f59b5659c-7w4ng_9fa6f04f-f8e0-4cbc-94b9-9114ea798d6c/console/0.log" Apr 23 01:44:29.842342 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.842318 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m"] Apr 23 01:44:29.845212 ip-10-0-138-235 kubenswrapper[2569]: W0423 01:44:29.845187 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5dc146f5_6526_4d84_91f2_3cc69fad5c9b.slice/crio-b95b977fefc405b88d821d208b919273713f73bbbf1066297cb9233b34145d95 WatchSource:0}: Error finding container b95b977fefc405b88d821d208b919273713f73bbbf1066297cb9233b34145d95: Status 404 returned error can't find the container with id b95b977fefc405b88d821d208b919273713f73bbbf1066297cb9233b34145d95 Apr 23 01:44:29.869186 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:29.866781 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-td9fq_74ab69de-e0f4-4c2e-9254-d9fc69aed149/download-server/0.log" Apr 23 01:44:30.120182 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:30.120096 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" event={"ID":"5dc146f5-6526-4d84-91f2-3cc69fad5c9b","Type":"ContainerStarted","Data":"288243b576dcf7119c28ba50166f569ed5a7967ac11ad721e3623049a6997b0d"} Apr 23 01:44:30.120182 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:30.120133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" event={"ID":"5dc146f5-6526-4d84-91f2-3cc69fad5c9b","Type":"ContainerStarted","Data":"b95b977fefc405b88d821d208b919273713f73bbbf1066297cb9233b34145d95"} Apr 23 01:44:30.120182 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:30.120173 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:30.149342 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:30.149292 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" podStartSLOduration=1.149277256 podStartE2EDuration="1.149277256s" podCreationTimestamp="2026-04-23 01:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-23 01:44:30.148692344 +0000 UTC m=+2077.934366696" watchObservedRunningTime="2026-04-23 01:44:30.149277256 +0000 UTC m=+2077.934951612" Apr 23 01:44:30.445343 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:30.445303 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-jnpmf_b4aa80dc-3684-4d0e-b46a-70d65c9c0782/volume-data-source-validator/0.log" Apr 23 01:44:31.511259 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:31.511228 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rpd44_746bf4ef-4ba5-45d1-9cc6-ab6354c10b18/dns/0.log" Apr 23 01:44:31.530199 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:31.530175 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rpd44_746bf4ef-4ba5-45d1-9cc6-ab6354c10b18/kube-rbac-proxy/0.log" Apr 23 01:44:31.551852 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:31.551810 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fgvhm_b025c029-af84-46be-a329-3c26d61f764a/dns-node-resolver/0.log" Apr 23 01:44:32.113319 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:32.113292 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-grrph_83355238-6978-4f5d-8b07-0ea3d3784353/node-ca/0.log" Apr 23 01:44:33.024866 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:33.024839 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f4qzln_814b8694-9550-460f-a692-dce74660f64d/istio-proxy/0.log" Apr 23 01:44:33.264207 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:33.263893 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-2qnsr_a5b1db40-a787-4d97-87ff-5443591b723f/istio-proxy/0.log" Apr 23 01:44:33.775840 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:33.775807 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bk66m_79d4f338-f964-4fa6-985e-50bbb3b105a9/serve-healthcheck-canary/0.log" Apr 23 01:44:34.270733 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:34.270668 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sxqx7_7df9359b-30f8-4806-98db-cf999c7f0ed8/insights-operator/0.log" Apr 23 01:44:34.271590 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:34.271556 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sxqx7_7df9359b-30f8-4806-98db-cf999c7f0ed8/insights-operator/1.log" Apr 23 01:44:34.352906 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:34.352883 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j5x9h_55cf13db-98bb-4d59-9791-81984d001d0c/kube-rbac-proxy/0.log" Apr 23 01:44:34.371432 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:34.371412 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j5x9h_55cf13db-98bb-4d59-9791-81984d001d0c/exporter/0.log" Apr 23 01:44:34.391490 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:34.391457 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j5x9h_55cf13db-98bb-4d59-9791-81984d001d0c/extractor/0.log" Apr 23 01:44:36.133827 
ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:36.133797 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6hxp5/perf-node-gather-daemonset-7q64m" Apr 23 01:44:36.521701 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:36.521664 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5fb5768b86-hvntf_45c1a1fa-47fe-4d79-813d-abe2e5c22d31/manager/0.log" Apr 23 01:44:37.768446 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:37.768418 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6b799cbd77-t4mj7_998ca98b-02e0-4f9b-8e9e-6c4fb7b647ad/manager/0.log" Apr 23 01:44:37.794831 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:37.794795 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-7mtq5_45831f9c-10ca-4d6e-a7fe-1cff13e1e7a2/openshift-lws-operator/0.log" Apr 23 01:44:42.557566 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:42.557527 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-krk48_2a988bd2-2ed4-468c-ab78-3be082ee6a61/kube-storage-version-migrator-operator/1.log" Apr 23 01:44:42.559718 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:42.559692 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-krk48_2a988bd2-2ed4-468c-ab78-3be082ee6a61/kube-storage-version-migrator-operator/0.log" Apr 23 01:44:43.676583 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.676551 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gzwpj_b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb/kube-multus-additional-cni-plugins/0.log" Apr 23 01:44:43.695770 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.695743 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gzwpj_b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb/egress-router-binary-copy/0.log" Apr 23 01:44:43.716951 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.716928 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gzwpj_b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb/cni-plugins/0.log" Apr 23 01:44:43.736095 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.736070 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gzwpj_b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb/bond-cni-plugin/0.log" Apr 23 01:44:43.754888 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.754867 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gzwpj_b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb/routeoverride-cni/0.log" Apr 23 01:44:43.778866 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.778839 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gzwpj_b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb/whereabouts-cni-bincopy/0.log" Apr 23 01:44:43.798239 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.798221 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gzwpj_b83ed83c-9c45-410a-8bf4-06d1bf0a4bbb/whereabouts-cni/0.log" Apr 23 01:44:43.984772 
ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:43.984687 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c6dx7_21b4a1fd-e436-4824-abb9-d40c296dc036/kube-multus/0.log" Apr 23 01:44:44.051983 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:44.051952 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5mm4v_608e8d52-e2cd-48e3-b524-0f0d764d9501/network-metrics-daemon/0.log" Apr 23 01:44:44.071277 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:44.071250 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5mm4v_608e8d52-e2cd-48e3-b524-0f0d764d9501/kube-rbac-proxy/0.log" Apr 23 01:44:44.946220 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:44.946188 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-controller/0.log" Apr 23 01:44:44.961283 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:44.961235 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/0.log" Apr 23 01:44:44.980318 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:44.980293 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovn-acl-logging/1.log" Apr 23 01:44:45.001637 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:45.001600 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/kube-rbac-proxy-node/0.log" Apr 23 01:44:45.023095 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:45.023069 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 01:44:45.039130 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:45.039101 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/northd/0.log" Apr 23 01:44:45.057231 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:45.057197 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/nbdb/0.log" Apr 23 01:44:45.076140 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:45.076108 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/sbdb/0.log" Apr 23 01:44:45.246698 ip-10-0-138-235 kubenswrapper[2569]: I0423 01:44:45.246628 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-79v47_52115d8e-033b-485d-aa67-434e7ae395d5/ovnkube-controller/0.log"