Apr 21 04:36:27.460695 ip-10-0-141-241 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 04:36:27.460708 ip-10-0-141-241 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 04:36:27.460717 ip-10-0-141-241 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 04:36:27.460916 ip-10-0-141-241 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 04:36:37.486990 ip-10-0-141-241 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 04:36:37.487006 ip-10-0-141-241 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3ed036aa8a3f42fea2042b9f98cf8c1f --
Apr 21 04:39:04.753354 ip-10-0-141-241 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 04:39:05.232889 ip-10-0-141-241 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:39:05.232889 ip-10-0-141-241 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 04:39:05.232889 ip-10-0-141-241 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:39:05.232889 ip-10-0-141-241 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 04:39:05.232889 ip-10-0-141-241 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:39:05.234741 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.234650 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240482 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240505 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240509 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240513 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240516 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240519 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240522 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240525 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:39:05.240516 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240527 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240531 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240534 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240536 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240539 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240541 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240544 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240547 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240549 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240551 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240554 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240557 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240559 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240562 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240565 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240568 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240571 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240574 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240576 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:39:05.240840 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240579 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240581 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240584 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240586 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240589 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240591 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240593 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240596 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240599 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240603 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240605 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240608 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240610 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240613 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240615 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240619 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240621 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240624 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240627 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240630 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:39:05.241318 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240632 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240635 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240637 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240640 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240643 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240645 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240648 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240650 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240653 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240656 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240658 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240661 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240663 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240666 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240669 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240672 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240675 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240677 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240680 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240683 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:39:05.241834 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240686 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240688 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240690 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240693 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240696 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240699 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240701 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240704 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240707 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240710 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240712 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240716 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240718 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240721 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240724 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240729 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240732 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240735 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.240739 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:39:05.242353 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241169 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241176 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241179 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241182 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241185 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241187 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241190 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241193 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241196 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241198 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241201 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241205 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241208 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241211 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241214 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241216 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241219 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241222 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241224 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241227 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:39:05.242847 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241230 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241232 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241235 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241239 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241243 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241246 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241249 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241252 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241255 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241258 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241260 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241263 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241266 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241268 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241271 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241273 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241276 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241279 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241282 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:39:05.243341 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241285 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241287 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241290 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241293 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241295 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241298 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241301 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241304 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241306 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241309 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241312 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241314 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241317 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241319 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241323 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241325 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241329 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241332 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241335 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:39:05.243814 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241339 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241343 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241346 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241349 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241351 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241354 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241357 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241359 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241361 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241364 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241366 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241369 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241372 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241374 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241377 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241379 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241382 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241385 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241387 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241389 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:39:05.244294 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241392 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241394 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241397 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241399 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241402 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241404 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241407 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.241409 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242783 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242793 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242801 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242806 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242812 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242816 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242821 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242825 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242829 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242832 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242836 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242840 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242843 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242846 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 21 04:39:05.244788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242849 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242852 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242856 2568 flags.go:64] FLAG: --cloud-config=""
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242859 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242862 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242868 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242871 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242875 2568 flags.go:64] FLAG: --config-dir=""
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242878 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242881 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242885 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242889 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242892 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242895 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242898 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242901 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242904 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242908 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242910 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242915 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242918 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242921 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242924 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242928 2568 flags.go:64] FLAG: --enable-server="true"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242931 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 04:39:05.245341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242936 2568 flags.go:64] FLAG: --event-burst="100"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242939 2568 flags.go:64] FLAG: --event-qps="50"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242942 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242945 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242948 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242952 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242955 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242959 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242962 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242965 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242968 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242971 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242974 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242977 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242980 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242983 2568 flags.go:64] FLAG: --feature-gates=""
Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242987 2568 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242990 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242993 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242996 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.242999 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243002 2568 flags.go:64] FLAG: --help="false" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243005 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243008 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 04:39:05.245977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243011 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243013 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243017 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243020 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243023 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243026 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 04:39:05.246578 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:39:05.243029 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243033 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243036 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243039 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243042 2568 flags.go:64] FLAG: --kube-reserved="" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243045 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243048 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243052 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243054 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243057 2568 flags.go:64] FLAG: --lock-file="" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243060 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243063 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243077 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243083 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243086 2568 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243089 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243091 2568 flags.go:64] FLAG: --logging-format="text" Apr 21 04:39:05.246578 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243094 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243098 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243101 2568 flags.go:64] FLAG: --manifest-url="" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243103 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243108 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243111 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243115 2568 flags.go:64] FLAG: --max-pods="110" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243118 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243121 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243124 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243127 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243130 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 04:39:05.247156 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243133 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243136 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243144 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243147 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243150 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243154 2568 flags.go:64] FLAG: --pod-cidr="" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243157 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243163 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243166 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243170 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243172 2568 flags.go:64] FLAG: --port="10250" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243176 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 04:39:05.247156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243179 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-064ebc4a5a0fda653" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243182 2568 flags.go:64] FLAG: --qos-reserved="" Apr 
21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243185 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243188 2568 flags.go:64] FLAG: --register-node="true" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243191 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243194 2568 flags.go:64] FLAG: --register-with-taints="" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243198 2568 flags.go:64] FLAG: --registry-burst="10" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243201 2568 flags.go:64] FLAG: --registry-qps="5" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243203 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243206 2568 flags.go:64] FLAG: --reserved-memory="" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243210 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243214 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243217 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243220 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243222 2568 flags.go:64] FLAG: --runonce="false" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243225 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243228 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243231 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243234 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243237 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243240 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243243 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243246 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243249 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243252 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243255 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 04:39:05.247796 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243258 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243262 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243265 2568 flags.go:64] FLAG: --system-cgroups="" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243268 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243273 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 04:39:05.248449 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:39:05.243277 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243280 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243284 2568 flags.go:64] FLAG: --tls-min-version="" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243287 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243290 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243293 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243296 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243299 2568 flags.go:64] FLAG: --v="2" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243304 2568 flags.go:64] FLAG: --version="false" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243308 2568 flags.go:64] FLAG: --vmodule="" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243313 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243316 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243423 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243427 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243430 2568 feature_gate.go:328] 
unrecognized feature gate: Example Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243432 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243438 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243440 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:39:05.248449 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243443 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243446 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243448 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243451 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243454 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243456 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243459 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243461 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243464 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:39:05.249028 ip-10-0-141-241 
kubenswrapper[2568]: W0421 04:39:05.243466 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243470 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243473 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243476 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243479 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243481 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243484 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243486 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243489 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243491 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243494 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:39:05.249028 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243497 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243499 2568 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243502 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243504 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243507 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243509 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243512 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243514 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243518 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243521 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243526 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243529 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243531 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243534 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243536 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243539 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243541 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243544 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243546 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243549 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:39:05.249594 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243551 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243554 2568 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243556 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243560 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243562 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243565 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243571 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243573 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243576 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243580 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243584 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243587 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243590 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243593 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243596 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243599 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243602 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243604 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243607 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:39:05.250117 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243609 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243612 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243614 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 
04:39:05.243619 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243622 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243624 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243627 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243629 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243632 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243635 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243637 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243640 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243642 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243645 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243647 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243650 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243653 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243656 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243659 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243662 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:39:05.250592 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.243665 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:39:05.251102 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.243671 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:39:05.252523 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.252501 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 04:39:05.252576 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.252524 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 04:39:05.252609 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252597 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:39:05.252609 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252603 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:39:05.252609 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252607 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:39:05.252609 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252610 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252613 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252617 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252620 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252623 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252626 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252629 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252631 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252634 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252636 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252639 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252642 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252645 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252648 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252650 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252653 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252656 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252658 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252661 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252663 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:39:05.252759 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252666 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252668 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252671 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252673 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252676 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252678 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252681 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252683 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252687 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252690 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252693 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252696 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252698 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252701 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252704 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252707 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252709 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252712 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252714 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252717 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:39:05.253359 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252719 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252722 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252724 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252726 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252729 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252731 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252734 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252737 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252739 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252742 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252744 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252747 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252749 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252753 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252755 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252758 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252760 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252763 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252766 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:39:05.253917 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252769 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252772 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252777 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252780 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252782 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252785 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252788 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252792 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252794 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252797 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252801 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252805 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252809 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252811 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252814 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252817 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252819 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252822 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:39:05.254550 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252825 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252827 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252830 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252832 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252835 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252837 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.252842 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252946 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252953 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252956 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252960 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252962 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252965 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252968 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252971 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 04:39:05.255057 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252974 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252977 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252980 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252983 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252986 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252988 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252991 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252994 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252997 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.252999 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253002 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253005 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253007 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253010 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253013 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253015 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253018 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253020 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253023 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253025 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 04:39:05.255474 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253028 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253030 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253033 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253036 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253039 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253041 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253044 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253046 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253049 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253051 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253054 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253056 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253059 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253061 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253063 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253082 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253086 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253088 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253092 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253094 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 04:39:05.255993 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253097 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253100 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253103 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253106 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253108 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253111 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253113 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253116 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253119 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253121 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253124 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253126 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253129 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253132 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253135 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253137 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253140 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253143 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253145 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253148 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 04:39:05.256508 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253151 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253153 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253156 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253158 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253161 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253163 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253166 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253168 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253171 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253174 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253177 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253179 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253182 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253185 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253187 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253190 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253193 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 04:39:05.257011 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:05.253196 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 04:39:05.257586 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.253202 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 04:39:05.257586 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.254088 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 04:39:05.257586 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.256622 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 04:39:05.257709 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.257697 2568 server.go:1019] "Starting client certificate rotation"
Apr 21 04:39:05.257810 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.257791 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:39:05.257843 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.257838 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 04:39:05.284181 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.284160 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:39:05.288395 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.288369 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 04:39:05.304487 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.304466 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 21 04:39:05.311082 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.311053 2568 log.go:25] "Validated CRI v1 image API"
Apr 21 04:39:05.312493 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.312475 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 04:39:05.314767 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.314747 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:39:05.315664 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.315639 2568 fs.go:135] Filesystem UUIDs: map[0da61a82-708a-45f0-a0ae-b19fa30409f3:/dev/nvme0n1p4 6870d6ff-94b8-4940-a322-7c493cdacf8f:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 21 04:39:05.315709 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.315665 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 04:39:05.320837 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.320720 2568 manager.go:217] Machine: {Timestamp:2026-04-21 04:39:05.319381575 +0000 UTC m=+0.440939374 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098795 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2836d634b1b21fb565a4e87cd77d66 SystemUUID:ec2836d6-34b1-b21f-b565-a4e87cd77d66 BootID:3ed036aa-8a3f-42fe-a204-2b9f98cf8c1f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8d:f5:06:45:ff Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8d:f5:06:45:ff Speed:0 Mtu:9001} {Name:ovs-system MacAddress:82:18:f7:33:ef:84 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 04:39:05.321248 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.321238 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 04:39:05.321339 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.321327 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 04:39:05.323565 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.323538 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 04:39:05.323706 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.323568 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-241.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 04:39:05.324644 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.324634 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 04:39:05.324684 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.324646 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 04:39:05.324684 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.324659
2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:39:05.324684 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.324670 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:39:05.326019 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.326009 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:39:05.326136 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.326127 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 04:39:05.331202 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.331180 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 21 04:39:05.331202 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.331208 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 04:39:05.331304 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.331221 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 04:39:05.331304 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.331230 2568 kubelet.go:397] "Adding apiserver pod source" Apr 21 04:39:05.331304 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.331239 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 04:39:05.332335 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.332321 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:39:05.332374 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.332350 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:39:05.332418 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.332398 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hkrsn" Apr 21 04:39:05.336236 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:39:05.336220 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 04:39:05.339848 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.339826 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 04:39:05.340400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.339876 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hkrsn" Apr 21 04:39:05.341875 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341850 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 04:39:05.341936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341883 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 04:39:05.341936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341893 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 04:39:05.341936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341901 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 04:39:05.341936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341910 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 04:39:05.341936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341919 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 04:39:05.341936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341927 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 04:39:05.341936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341936 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 04:39:05.342157 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:39:05.341947 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 04:39:05.342157 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341958 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 04:39:05.342157 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341971 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 04:39:05.342157 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.341984 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 04:39:05.342908 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.342895 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 04:39:05.342908 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.342909 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 04:39:05.345959 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.345944 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:05.346706 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.346694 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 04:39:05.346747 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.346732 2568 server.go:1295] "Started kubelet" Apr 21 04:39:05.346891 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.346831 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 04:39:05.346949 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.346915 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 04:39:05.347185 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.346824 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 04:39:05.347607 ip-10-0-141-241 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 04:39:05.348323 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.348307 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 04:39:05.349464 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.349444 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:05.349464 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.349467 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 21 04:39:05.351385 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.351366 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-241.ec2.internal" not found Apr 21 04:39:05.353300 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.353284 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 04:39:05.353774 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.353761 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 04:39:05.354459 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.354442 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 04:39:05.354459 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.354460 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 04:39:05.354597 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.354552 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 04:39:05.354648 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.354602 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 21 04:39:05.354648 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.354608 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 21 04:39:05.354953 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:05.354904 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-141-241.ec2.internal\" not found" Apr 21 04:39:05.355040 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.354956 2568 factory.go:55] Registering systemd factory Apr 21 04:39:05.355040 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.355011 2568 factory.go:223] Registration of the systemd container factory successfully Apr 21 04:39:05.355495 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.355468 2568 factory.go:153] Registering CRI-O factory Apr 21 04:39:05.355495 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.355484 2568 factory.go:223] Registration of the crio container factory successfully Apr 21 04:39:05.355617 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.355554 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 04:39:05.355617 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.355595 2568 factory.go:103] Registering Raw factory Apr 21 04:39:05.355617 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.355613 2568 manager.go:1196] Started watching for new ooms in manager Apr 21 04:39:05.356038 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.356025 2568 manager.go:319] Starting recovery of all containers Apr 21 04:39:05.356676 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:05.356656 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 04:39:05.356867 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.356852 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:05.359739 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:05.359719 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-241.ec2.internal\" not found" node="ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.364729 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.364321 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 04:39:05.366987 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.366935 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-241.ec2.internal" not found Apr 21 04:39:05.367989 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.367972 2568 manager.go:324] Recovery completed Apr 21 04:39:05.372227 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.372211 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:39:05.374310 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.374292 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-241.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:39:05.374385 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.374324 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:39:05.374385 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.374335 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-241.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:39:05.374868 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.374853 2568 cpu_manager.go:222] "Starting 
CPU manager" policy="none" Apr 21 04:39:05.374868 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.374866 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 04:39:05.374962 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.374883 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:39:05.377252 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.377240 2568 policy_none.go:49] "None policy: Start" Apr 21 04:39:05.377298 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.377257 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 04:39:05.377298 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.377266 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 21 04:39:05.416755 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.416736 2568 manager.go:341] "Starting Device Plugin manager" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:05.416776 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.416790 2568 server.go:85] "Starting device plugin registration server" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.417107 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.417118 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.417223 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.417301 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:39:05.417309 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:05.417860 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 04:39:05.425667 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:05.417905 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-241.ec2.internal\" not found" Apr 21 04:39:05.425957 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.425895 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-241.ec2.internal" not found Apr 21 04:39:05.480147 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.480122 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 04:39:05.480147 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.480156 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 04:39:05.480350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.480178 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 04:39:05.480350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.480186 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 04:39:05.480350 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:05.480218 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 04:39:05.482817 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.482798 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:05.517802 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.517714 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:39:05.519256 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.519240 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-241.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:39:05.519330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.519271 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:39:05.519330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.519283 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-241.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:39:05.519330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.519307 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.528341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.528323 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.581307 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.581281 2568 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal"] Apr 21 04:39:05.583506 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.583489 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.583586 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.583491 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.602598 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.602577 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.606464 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.606451 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.637402 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.637380 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 04:39:05.640053 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.640037 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 04:39:05.656239 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.656218 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4404daed50f62015edf6e395cc7a9dec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal\" (UID: \"4404daed50f62015edf6e395cc7a9dec\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.656330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.656247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4404daed50f62015edf6e395cc7a9dec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal\" (UID: \"4404daed50f62015edf6e395cc7a9dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.656330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.656264 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b815c8ee131098fba59a15d8b002ae1a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-241.ec2.internal\" (UID: \"b815c8ee131098fba59a15d8b002ae1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.757417 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.757383 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4404daed50f62015edf6e395cc7a9dec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal\" (UID: \"4404daed50f62015edf6e395cc7a9dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.757417 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.757424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4404daed50f62015edf6e395cc7a9dec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal\" (UID: \"4404daed50f62015edf6e395cc7a9dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.757553 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:39:05.757453 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b815c8ee131098fba59a15d8b002ae1a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-241.ec2.internal\" (UID: \"b815c8ee131098fba59a15d8b002ae1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.757553 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.757482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4404daed50f62015edf6e395cc7a9dec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal\" (UID: \"4404daed50f62015edf6e395cc7a9dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.757553 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.757497 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b815c8ee131098fba59a15d8b002ae1a-config\") pod \"kube-apiserver-proxy-ip-10-0-141-241.ec2.internal\" (UID: \"b815c8ee131098fba59a15d8b002ae1a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.757553 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.757504 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4404daed50f62015edf6e395cc7a9dec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal\" (UID: \"4404daed50f62015edf6e395cc7a9dec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.941521 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.941494 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" Apr 21 04:39:05.941698 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:05.941494 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" Apr 21 04:39:06.257456 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.257383 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 04:39:06.257950 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.257510 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 04:39:06.257950 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.257535 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 04:39:06.257950 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.257558 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 04:39:06.331999 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.331960 2568 apiserver.go:52] "Watching apiserver" Apr 21 04:39:06.342076 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.342017 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 04:34:05 +0000 
UTC" deadline="2027-12-02 18:30:14.396756673 +0000 UTC" Apr 21 04:39:06.342076 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.342062 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14173h51m8.054698114s" Apr 21 04:39:06.343403 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.343386 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 04:39:06.343765 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.343743 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jhdlv","kube-system/konnectivity-agent-8djwj","kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vq49v","openshift-dns/node-resolver-n7x9t","openshift-multus/network-metrics-daemon-z4rqh","openshift-network-operator/iptables-alerter-qjbk5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb","openshift-image-registry/node-ca-hm47b","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal","openshift-multus/multus-2pbgh","openshift-multus/multus-additional-cni-plugins-km9z2","openshift-network-diagnostics/network-check-target-n47bc"] Apr 21 04:39:06.345175 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.345156 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.347811 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.347792 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8djwj"
Apr 21 04:39:06.348080 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.348040 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 04:39:06.348293 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.348270 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 04:39:06.348363 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.348346 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 04:39:06.348430 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.348348 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 04:39:06.348430 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.348401 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vkg9t\""
Apr 21 04:39:06.348430 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.348425 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 04:39:06.349572 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.349556 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 04:39:06.350026 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.350008 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6kmv9\""
Apr 21 04:39:06.350140 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.350088 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.350328 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.350316 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 04:39:06.350509 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.350496 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 04:39:06.351156 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.351136 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.352360 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.352344 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:06.352477 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.352456 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:06.352711 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.352695 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qk6l9\""
Apr 21 04:39:06.352798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.352695 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:39:06.352857 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.352832 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 04:39:06.353384 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.353369 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 04:39:06.353506 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.353491 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.354402 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.354385 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 04:39:06.354487 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.354476 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vz5cd\""
Apr 21 04:39:06.354547 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.354515 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 04:39:06.354768 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.354754 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.355825 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.355810 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.357023 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.357004 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:39:06.357123 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.357005 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 04:39:06.357123 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.357034 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-smttk\""
Apr 21 04:39:06.357319 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.357304 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 04:39:06.357354 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.357343 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.357872 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.357850 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 04:39:06.358024 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.358005 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 04:39:06.358154 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.358137 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 04:39:06.358268 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.358252 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 04:39:06.358392 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.358375 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jhc9c\""
Apr 21 04:39:06.358623 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.358608 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 04:39:06.358667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.358608 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dmpdz\""
Apr 21 04:39:06.358991 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.358973 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.359413 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.359397 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 04:39:06.359855 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.359841 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 04:39:06.360347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.360331 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 04:39:06.360347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.360339 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 04:39:06.360461 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.360440 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ckwn4\""
Apr 21 04:39:06.360569 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.360553 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:06.360626 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.360601 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 04:39:06.360680 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.360620 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:06.361260 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361245 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 04:39:06.361348 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361245 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysctl-d\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.361348 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361293 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/800e074a-3700-4f73-b98f-4007eadc4414-etc-tuned\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.361348 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-node-log\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.361477 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.361477 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovnkube-config\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.361477 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovn-node-metrics-cert\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.361477 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:06.361670 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361487 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-socket-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.361670 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361521 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-sys\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.361670 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361558 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcdn9\" (UniqueName: \"kubernetes.io/projected/78c76e75-e505-48b3-b431-e14495f564b9-kube-api-access-pcdn9\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.361670 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/12c88bd4-6fd7-4968-a57a-c248d0a14470-agent-certs\") pod \"konnectivity-agent-8djwj\" (UID: \"12c88bd4-6fd7-4968-a57a-c248d0a14470\") " pod="kube-system/konnectivity-agent-8djwj"
Apr 21 04:39:06.361670 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361609 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.361670 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361643 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-modprobe-d\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.361670 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361667 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysconfig\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.361930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361689 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-cni-bin\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.361930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361734 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovnkube-script-lib\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.361930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361768 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fed25630-f93d-40db-800e-f8042fc4f7ca-serviceca\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.361930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2n9\" (UniqueName: \"kubernetes.io/projected/fed25630-f93d-40db-800e-f8042fc4f7ca-kube-api-access-rp2n9\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.361930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361840 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/12c88bd4-6fd7-4968-a57a-c248d0a14470-konnectivity-ca\") pod \"konnectivity-agent-8djwj\" (UID: \"12c88bd4-6fd7-4968-a57a-c248d0a14470\") " pod="kube-system/konnectivity-agent-8djwj"
Apr 21 04:39:06.361930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361878 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-kubernetes\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.361930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361901 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-kubelet\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361969 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dgk\" (UniqueName: \"kubernetes.io/projected/8b8f621a-cf82-4827-a595-5b1724d2e7df-kube-api-access-75dgk\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361987 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362014 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-device-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-cni-netd\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.361990 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5p9gd\""
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362137 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-env-overrides\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-systemd\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362182 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-run\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362197 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/800e074a-3700-4f73-b98f-4007eadc4414-tmp\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.362223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqs4p\" (UniqueName: \"kubernetes.io/projected/800e074a-3700-4f73-b98f-4007eadc4414-kube-api-access-mqs4p\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-run-netns\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362249 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-var-lib-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-499vz\" (UniqueName: \"kubernetes.io/projected/8743d92c-6080-4066-ad83-55bb582a3f6c-kube-api-access-499vz\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysctl-conf\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362315 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-lib-modules\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-var-lib-kubelet\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362351 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-ovn\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362366 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-host\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-systemd-units\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a74e0bd7-e17f-4529-9299-93c38644ab68-tmp-dir\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22bvg\" (UniqueName: \"kubernetes.io/projected/a74e0bd7-e17f-4529-9299-93c38644ab68-kube-api-access-22bvg\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362522 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-registration-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.362577 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362572 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-987nh\" (UniqueName: \"kubernetes.io/projected/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-kube-api-access-987nh\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362631 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/78c76e75-e505-48b3-b431-e14495f564b9-iptables-alerter-script\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78c76e75-e505-48b3-b431-e14495f564b9-host-slash\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362709 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-slash\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-etc-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362776 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-sys-fs\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362800 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-systemd\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362826 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fed25630-f93d-40db-800e-f8042fc4f7ca-host\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362857 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-etc-selinux\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362892 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-log-socket\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.363162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.362920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a74e0bd7-e17f-4529-9299-93c38644ab68-hosts-file\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.365986 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.365967 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 04:39:06.388384 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.388359 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t5hrm"
Apr 21 04:39:06.395627 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.395605 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t5hrm"
Apr 21 04:39:06.456629 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.456605 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 04:39:06.463281 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463260 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:06.463393 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysctl-d\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.463393 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463321 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/800e074a-3700-4f73-b98f-4007eadc4414-etc-tuned\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.463393 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-node-log\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.463393 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.463591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovnkube-config\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.463591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovn-node-metrics-cert\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.463591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463449 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysctl-d\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.463591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463462 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-node-log\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.463591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:06.463591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463462 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463608 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-socket-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.463633 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463652 2568
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-sys\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463654 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-sys\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.463729 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:39:06.963690437 +0000 UTC m=+2.085248224 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463757 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-socket-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcdn9\" (UniqueName: \"kubernetes.io/projected/78c76e75-e505-48b3-b431-e14495f564b9-kube-api-access-pcdn9\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5" Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/12c88bd4-6fd7-4968-a57a-c248d0a14470-agent-certs\") pod \"konnectivity-agent-8djwj\" (UID: \"12c88bd4-6fd7-4968-a57a-c248d0a14470\") " pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:06.463869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463854 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" Apr 21 04:39:06.464400 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:39:06.463884 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-system-cni-dir\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463976 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.463998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-modprobe-d\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464028 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysconfig\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464033 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovnkube-config\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-socket-dir-parent\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464097 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-cni-bin\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-modprobe-d\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464120 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-conf-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464149 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-cni-bin\") pod \"ovnkube-node-jhdlv\" (UID: 
\"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464155 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysconfig\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovnkube-script-lib\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464196 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-cni-bin\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464218 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-multus-certs\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464245 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fed25630-f93d-40db-800e-f8042fc4f7ca-serviceca\") pod \"node-ca-hm47b\" (UID: 
\"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464309 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2n9\" (UniqueName: \"kubernetes.io/projected/fed25630-f93d-40db-800e-f8042fc4f7ca-kube-api-access-rp2n9\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b" Apr 21 04:39:06.464400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464354 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/12c88bd4-6fd7-4968-a57a-c248d0a14470-konnectivity-ca\") pod \"konnectivity-agent-8djwj\" (UID: \"12c88bd4-6fd7-4968-a57a-c248d0a14470\") " pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-system-cni-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464496 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cni-binary-copy\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-kubernetes\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-kubelet\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464596 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75dgk\" (UniqueName: \"kubernetes.io/projected/8b8f621a-cf82-4827-a595-5b1724d2e7df-kube-api-access-75dgk\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-kubelet\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464688 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovnkube-script-lib\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464690 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-device-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464728 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-device-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464731 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cnibin\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464755 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fed25630-f93d-40db-800e-f8042fc4f7ca-serviceca\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464865 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-cni-netd\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464908 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-kubernetes\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464953 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.464983 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-env-overrides\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465013 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.465223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465019 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-cni-netd\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465050 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/12c88bd4-6fd7-4968-a57a-c248d0a14470-konnectivity-ca\") pod \"konnectivity-agent-8djwj\" (UID: \"12c88bd4-6fd7-4968-a57a-c248d0a14470\") " pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465095 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-systemd\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-run\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465174 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/800e074a-3700-4f73-b98f-4007eadc4414-tmp\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 
04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqs4p\" (UniqueName: \"kubernetes.io/projected/800e074a-3700-4f73-b98f-4007eadc4414-kube-api-access-mqs4p\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465261 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-run-netns\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465283 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-run\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465285 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-var-lib-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465321 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-var-lib-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465321 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-499vz\" (UniqueName: \"kubernetes.io/projected/8743d92c-6080-4066-ad83-55bb582a3f6c-kube-api-access-499vz\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8b8f621a-cf82-4827-a595-5b1724d2e7df-env-overrides\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-run-netns\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465572 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-systemd\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465679 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-os-release\") pod \"multus-2pbgh\" (UID: 
\"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysctl-conf\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.466007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-lib-modules\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-lib-modules\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-sysctl-conf\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465884 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-var-lib-kubelet\") pod \"tuned-vq49v\" (UID: 
\"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465932 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-ovn\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465962 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18ab0325-5097-4d89-bf24-9c599b9efbdc-cni-binary-copy\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-var-lib-kubelet\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.465989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-k8s-cni-cncf-io\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-hostroot\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466039 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-ovn\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466092 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-os-release\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466122 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-host\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-systemd-units\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466174 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a74e0bd7-e17f-4529-9299-93c38644ab68-tmp-dir\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466215 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-host\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22bvg\" (UniqueName: \"kubernetes.io/projected/a74e0bd7-e17f-4529-9299-93c38644ab68-kube-api-access-22bvg\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.466842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-registration-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466259 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-run-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466282 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-987nh\" (UniqueName: \"kubernetes.io/projected/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-kube-api-access-987nh\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466296 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-systemd-units\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-cnibin\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ltqp\" (UniqueName: \"kubernetes.io/projected/18ab0325-5097-4d89-bf24-9c599b9efbdc-kube-api-access-5ltqp\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466386 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466415 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/78c76e75-e505-48b3-b431-e14495f564b9-iptables-alerter-script\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78c76e75-e505-48b3-b431-e14495f564b9-host-slash\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466463 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-slash\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-kubelet\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-etc-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-sys-fs\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466572 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-cni-multus\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-systemd\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fed25630-f93d-40db-800e-f8042fc4f7ca-host\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466654 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-etc-selinux\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.467674 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466679 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-daemon-config\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466707 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a74e0bd7-e17f-4529-9299-93c38644ab68-tmp-dir\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-registration-dir\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-log-socket\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466795 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a74e0bd7-e17f-4529-9299-93c38644ab68-hosts-file\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-cni-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466828 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fed25630-f93d-40db-800e-f8042fc4f7ca-host\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466860 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-netns\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466901 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-host-slash\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466937 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-etc-openvswitch\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/800e074a-3700-4f73-b98f-4007eadc4414-etc-systemd\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8b8f621a-cf82-4827-a595-5b1724d2e7df-log-socket\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-sys-fs\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466982 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-etc-kubernetes\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.466991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a74e0bd7-e17f-4529-9299-93c38644ab68-hosts-file\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467014 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcslb\" (UniqueName: \"kubernetes.io/projected/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-kube-api-access-xcslb\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.468167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467083 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-etc-selinux\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.468752 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467107 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b8f621a-cf82-4827-a595-5b1724d2e7df-ovn-node-metrics-cert\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.468752 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/78c76e75-e505-48b3-b431-e14495f564b9-host-slash\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.468752 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467182 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/800e074a-3700-4f73-b98f-4007eadc4414-etc-tuned\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.468752 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467321 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/800e074a-3700-4f73-b98f-4007eadc4414-tmp\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.468752 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/78c76e75-e505-48b3-b431-e14495f564b9-iptables-alerter-script\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.468752 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.467524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/12c88bd4-6fd7-4968-a57a-c248d0a14470-agent-certs\") pod \"konnectivity-agent-8djwj\" (UID: \"12c88bd4-6fd7-4968-a57a-c248d0a14470\") " pod="kube-system/konnectivity-agent-8djwj"
Apr 21 04:39:06.469422 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.469388 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb815c8ee131098fba59a15d8b002ae1a.slice/crio-0280f5cebd5ad15b004d7cb69d77ef972fdd5b6b9a727a78e571fbde01d12fd6 WatchSource:0}: Error finding container 0280f5cebd5ad15b004d7cb69d77ef972fdd5b6b9a727a78e571fbde01d12fd6: Status 404 returned error can't find the container with id 0280f5cebd5ad15b004d7cb69d77ef972fdd5b6b9a727a78e571fbde01d12fd6
Apr 21 04:39:06.469627 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.469609 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4404daed50f62015edf6e395cc7a9dec.slice/crio-f96ab3b3e28b60a914c6879cd6a065bb9c01f2cbc83200d485fb4addc992f538 WatchSource:0}: Error finding container f96ab3b3e28b60a914c6879cd6a065bb9c01f2cbc83200d485fb4addc992f538: Status 404 returned error can't find the container with id f96ab3b3e28b60a914c6879cd6a065bb9c01f2cbc83200d485fb4addc992f538
Apr 21 04:39:06.472859 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.472828 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dgk\" (UniqueName: \"kubernetes.io/projected/8b8f621a-cf82-4827-a595-5b1724d2e7df-kube-api-access-75dgk\") pod \"ovnkube-node-jhdlv\" (UID: \"8b8f621a-cf82-4827-a595-5b1724d2e7df\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:39:06.474446 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.474427 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:39:06.477170 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.476751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22bvg\" (UniqueName: \"kubernetes.io/projected/a74e0bd7-e17f-4529-9299-93c38644ab68-kube-api-access-22bvg\") pod \"node-resolver-n7x9t\" (UID: \"a74e0bd7-e17f-4529-9299-93c38644ab68\") " pod="openshift-dns/node-resolver-n7x9t"
Apr 21 04:39:06.477170 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.477036 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqs4p\" (UniqueName: \"kubernetes.io/projected/800e074a-3700-4f73-b98f-4007eadc4414-kube-api-access-mqs4p\") pod \"tuned-vq49v\" (UID: \"800e074a-3700-4f73-b98f-4007eadc4414\") " pod="openshift-cluster-node-tuning-operator/tuned-vq49v"
Apr 21 04:39:06.478427 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.477541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcdn9\" (UniqueName: \"kubernetes.io/projected/78c76e75-e505-48b3-b431-e14495f564b9-kube-api-access-pcdn9\") pod \"iptables-alerter-qjbk5\" (UID: \"78c76e75-e505-48b3-b431-e14495f564b9\") " pod="openshift-network-operator/iptables-alerter-qjbk5"
Apr 21 04:39:06.478427 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.477563 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2n9\" (UniqueName: \"kubernetes.io/projected/fed25630-f93d-40db-800e-f8042fc4f7ca-kube-api-access-rp2n9\") pod \"node-ca-hm47b\" (UID: \"fed25630-f93d-40db-800e-f8042fc4f7ca\") " pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.478750 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.478728 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-499vz\" (UniqueName: \"kubernetes.io/projected/8743d92c-6080-4066-ad83-55bb582a3f6c-kube-api-access-499vz\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:06.478907 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.478867 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-987nh\" (UniqueName: \"kubernetes.io/projected/94f8b350-f4aa-4dd0-b82a-7ab99b4b7831-kube-api-access-987nh\") pod \"aws-ebs-csi-driver-node-gd7sb\" (UID: \"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb"
Apr 21 04:39:06.482723 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.482687 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" event={"ID":"4404daed50f62015edf6e395cc7a9dec","Type":"ContainerStarted","Data":"f96ab3b3e28b60a914c6879cd6a065bb9c01f2cbc83200d485fb4addc992f538"}
Apr 21 04:39:06.483443 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.483425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" event={"ID":"b815c8ee131098fba59a15d8b002ae1a","Type":"ContainerStarted","Data":"0280f5cebd5ad15b004d7cb69d77ef972fdd5b6b9a727a78e571fbde01d12fd6"}
Apr 21 04:39:06.487378 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.487363 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hm47b"
Apr 21 04:39:06.493992 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.493973 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed25630_f93d_40db_800e_f8042fc4f7ca.slice/crio-719c622ee97c9a059febbb68dd2fa8d77ac2462715f1700ce030f536259a8420 WatchSource:0}: Error finding container 719c622ee97c9a059febbb68dd2fa8d77ac2462715f1700ce030f536259a8420: Status 404 returned error can't find the container with id 719c622ee97c9a059febbb68dd2fa8d77ac2462715f1700ce030f536259a8420
Apr 21 04:39:06.568167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-system-cni-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568167 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cni-binary-copy\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-system-cni-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568217 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cnibin\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568248 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-os-release\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568306 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18ab0325-5097-4d89-bf24-9c599b9efbdc-cni-binary-copy\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cnibin\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-k8s-cni-cncf-io\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-hostroot\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-os-release\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568399 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-k8s-cni-cncf-io\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568399 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-os-release\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-cnibin\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-hostroot\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ltqp\" (UniqueName: \"kubernetes.io/projected/18ab0325-5097-4d89-bf24-9c599b9efbdc-kube-api-access-5ltqp\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568460 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-cnibin\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-os-release\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-kubelet\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568533 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-cni-multus\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-daemon-config\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568561 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-kubelet\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568583 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-cni-multus\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568610 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-cni-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-netns\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.568737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-cni-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-etc-kubernetes\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-netns\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcslb\" (UniqueName: \"kubernetes.io/projected/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-kube-api-access-xcslb\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2"
Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568743 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-etc-kubernetes\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh"
Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568809 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-system-cni-dir\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568851 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-socket-dir-parent\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-cni-bin\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568899 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-system-cni-dir\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-conf-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568934 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-conf-dir\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568937 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-multus-certs\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-socket-dir-parent\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568969 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-run-multus-certs\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.569512 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18ab0325-5097-4d89-bf24-9c599b9efbdc-cni-binary-copy\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.570302 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568982 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.570302 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.568996 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18ab0325-5097-4d89-bf24-9c599b9efbdc-host-var-lib-cni-bin\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.570302 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.569092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18ab0325-5097-4d89-bf24-9c599b9efbdc-multus-daemon-config\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.570302 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.569239 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-cni-binary-copy\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.577006 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.576986 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:39:06.577006 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.577004 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:39:06.577006 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.577013 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fmqbk for pod openshift-network-diagnostics/network-check-target-n47bc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:06.577235 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.577062 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk podName:f1f8e1ef-1033-4a98-837c-e59fe409b8fa nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.077048334 +0000 UTC m=+2.198606142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fmqbk" (UniqueName: "kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk") pod "network-check-target-n47bc" (UID: "f1f8e1ef-1033-4a98-837c-e59fe409b8fa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:06.579099 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.579080 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcslb\" (UniqueName: \"kubernetes.io/projected/4d82872b-b0cd-4247-abb3-ce1e75dfd32b-kube-api-access-xcslb\") pod \"multus-additional-cni-plugins-km9z2\" (UID: \"4d82872b-b0cd-4247-abb3-ce1e75dfd32b\") " pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.579180 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.579099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ltqp\" (UniqueName: \"kubernetes.io/projected/18ab0325-5097-4d89-bf24-9c599b9efbdc-kube-api-access-5ltqp\") pod \"multus-2pbgh\" (UID: \"18ab0325-5097-4d89-bf24-9c599b9efbdc\") " pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.680794 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.680767 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:06.686747 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.686721 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8f621a_cf82_4827_a595_5b1724d2e7df.slice/crio-ef48b21e050f5b1859c8a43a4d350b0137a625adc7a690af0f61ad4ff30d2546 WatchSource:0}: Error finding container ef48b21e050f5b1859c8a43a4d350b0137a625adc7a690af0f61ad4ff30d2546: Status 404 returned error can't find the container with id ef48b21e050f5b1859c8a43a4d350b0137a625adc7a690af0f61ad4ff30d2546 Apr 21 04:39:06.705804 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.705777 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:06.711354 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.711334 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vq49v" Apr 21 04:39:06.711610 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.711586 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c88bd4_6fd7_4968_a57a_c248d0a14470.slice/crio-0072db639aca3d12df92bbc6b722f0fd74e9e72424eed247d646c9e1b833e63e WatchSource:0}: Error finding container 0072db639aca3d12df92bbc6b722f0fd74e9e72424eed247d646c9e1b833e63e: Status 404 returned error can't find the container with id 0072db639aca3d12df92bbc6b722f0fd74e9e72424eed247d646c9e1b833e63e Apr 21 04:39:06.717766 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.717744 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800e074a_3700_4f73_b98f_4007eadc4414.slice/crio-5c00c55e372deffb815c4b076ac82119f4b08618529aeeef0953d0d9890bea48 WatchSource:0}: Error finding container 
5c00c55e372deffb815c4b076ac82119f4b08618529aeeef0953d0d9890bea48: Status 404 returned error can't find the container with id 5c00c55e372deffb815c4b076ac82119f4b08618529aeeef0953d0d9890bea48 Apr 21 04:39:06.722947 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.722932 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n7x9t" Apr 21 04:39:06.728481 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.728461 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74e0bd7_e17f_4529_9299_93c38644ab68.slice/crio-b515968291f0097f66c04cfdaa040d193f1bb261e943cd751c3a7a50a26f3134 WatchSource:0}: Error finding container b515968291f0097f66c04cfdaa040d193f1bb261e943cd751c3a7a50a26f3134: Status 404 returned error can't find the container with id b515968291f0097f66c04cfdaa040d193f1bb261e943cd751c3a7a50a26f3134 Apr 21 04:39:06.737894 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.737876 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qjbk5" Apr 21 04:39:06.743309 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.743285 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c76e75_e505_48b3_b431_e14495f564b9.slice/crio-8852bba5884117550c333f54c7096d5c6c6fded31cb4abdff0197db3af34dc03 WatchSource:0}: Error finding container 8852bba5884117550c333f54c7096d5c6c6fded31cb4abdff0197db3af34dc03: Status 404 returned error can't find the container with id 8852bba5884117550c333f54c7096d5c6c6fded31cb4abdff0197db3af34dc03 Apr 21 04:39:06.753686 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.753667 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" Apr 21 04:39:06.761023 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.760998 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f8b350_f4aa_4dd0_b82a_7ab99b4b7831.slice/crio-7e1c0798333b1333cdb220fa5ccc6b351e8b1445933cbaf0b17a2fc7f1ba5ba1 WatchSource:0}: Error finding container 7e1c0798333b1333cdb220fa5ccc6b351e8b1445933cbaf0b17a2fc7f1ba5ba1: Status 404 returned error can't find the container with id 7e1c0798333b1333cdb220fa5ccc6b351e8b1445933cbaf0b17a2fc7f1ba5ba1 Apr 21 04:39:06.796104 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.796046 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2pbgh" Apr 21 04:39:06.800783 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.800764 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-km9z2" Apr 21 04:39:06.801994 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.801972 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ab0325_5097_4d89_bf24_9c599b9efbdc.slice/crio-0139a49f9221e3a1181606230ead5ddf24a3f2336ae64a623a693b2d0e0fe29f WatchSource:0}: Error finding container 0139a49f9221e3a1181606230ead5ddf24a3f2336ae64a623a693b2d0e0fe29f: Status 404 returned error can't find the container with id 0139a49f9221e3a1181606230ead5ddf24a3f2336ae64a623a693b2d0e0fe29f Apr 21 04:39:06.806723 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:39:06.806701 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d82872b_b0cd_4247_abb3_ce1e75dfd32b.slice/crio-c319d2e70e54636ed6e49c1dd168143b0513a29e1f870d9beb1f479af4bfa722 WatchSource:0}: Error finding container 
c319d2e70e54636ed6e49c1dd168143b0513a29e1f870d9beb1f479af4bfa722: Status 404 returned error can't find the container with id c319d2e70e54636ed6e49c1dd168143b0513a29e1f870d9beb1f479af4bfa722 Apr 21 04:39:06.973212 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:06.973180 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:06.973396 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.973378 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:06.973479 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:06.973448 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:39:07.973427618 +0000 UTC m=+3.094985418 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:07.087083 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.086771 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:07.175615 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.175499 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:07.175788 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:07.175645 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:39:07.175788 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:07.175662 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:39:07.175788 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:07.175674 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fmqbk for pod openshift-network-diagnostics/network-check-target-n47bc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:07.175788 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:07.175724 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk podName:f1f8e1ef-1033-4a98-837c-e59fe409b8fa nodeName:}" failed. No retries permitted until 2026-04-21 04:39:08.17570544 +0000 UTC m=+3.297263231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fmqbk" (UniqueName: "kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk") pod "network-check-target-n47bc" (UID: "f1f8e1ef-1033-4a98-837c-e59fe409b8fa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:07.251611 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.251370 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:07.380133 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.379912 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:39:07.396559 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.396470 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:34:06 +0000 UTC" deadline="2027-11-24 21:24:47.351598524 +0000 UTC" Apr 21 04:39:07.396559 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.396501 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13984h45m39.955101161s" Apr 21 04:39:07.513483 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.513204 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2pbgh" event={"ID":"18ab0325-5097-4d89-bf24-9c599b9efbdc","Type":"ContainerStarted","Data":"0139a49f9221e3a1181606230ead5ddf24a3f2336ae64a623a693b2d0e0fe29f"} Apr 21 04:39:07.543245 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.543193 
2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vq49v" event={"ID":"800e074a-3700-4f73-b98f-4007eadc4414","Type":"ContainerStarted","Data":"5c00c55e372deffb815c4b076ac82119f4b08618529aeeef0953d0d9890bea48"} Apr 21 04:39:07.577385 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.577340 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hm47b" event={"ID":"fed25630-f93d-40db-800e-f8042fc4f7ca","Type":"ContainerStarted","Data":"719c622ee97c9a059febbb68dd2fa8d77ac2462715f1700ce030f536259a8420"} Apr 21 04:39:07.595298 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.595239 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerStarted","Data":"c319d2e70e54636ed6e49c1dd168143b0513a29e1f870d9beb1f479af4bfa722"} Apr 21 04:39:07.612191 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.612102 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" event={"ID":"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831","Type":"ContainerStarted","Data":"7e1c0798333b1333cdb220fa5ccc6b351e8b1445933cbaf0b17a2fc7f1ba5ba1"} Apr 21 04:39:07.639663 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.639620 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qjbk5" event={"ID":"78c76e75-e505-48b3-b431-e14495f564b9","Type":"ContainerStarted","Data":"8852bba5884117550c333f54c7096d5c6c6fded31cb4abdff0197db3af34dc03"} Apr 21 04:39:07.648690 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.648650 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n7x9t" event={"ID":"a74e0bd7-e17f-4529-9299-93c38644ab68","Type":"ContainerStarted","Data":"b515968291f0097f66c04cfdaa040d193f1bb261e943cd751c3a7a50a26f3134"} Apr 21 04:39:07.660697 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.660658 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8djwj" event={"ID":"12c88bd4-6fd7-4968-a57a-c248d0a14470","Type":"ContainerStarted","Data":"0072db639aca3d12df92bbc6b722f0fd74e9e72424eed247d646c9e1b833e63e"} Apr 21 04:39:07.663866 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.663831 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"ef48b21e050f5b1859c8a43a4d350b0137a625adc7a690af0f61ad4ff30d2546"} Apr 21 04:39:07.981960 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:07.981681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:07.982189 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:07.982162 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:07.982257 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:07.982238 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:39:09.982216346 +0000 UTC m=+5.103774154 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:39:08.183521 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:08.183487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:08.183702 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:08.183686 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:39:08.183796 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:08.183712 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:39:08.183796 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:08.183725 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fmqbk for pod openshift-network-diagnostics/network-check-target-n47bc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:39:08.183796 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:08.183787 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk podName:f1f8e1ef-1033-4a98-837c-e59fe409b8fa nodeName:}" failed. 
No retries permitted until 2026-04-21 04:39:10.183765861 +0000 UTC m=+5.305323669 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fmqbk" (UniqueName: "kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk") pod "network-check-target-n47bc" (UID: "f1f8e1ef-1033-4a98-837c-e59fe409b8fa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:08.396668 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:08.396623 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:34:06 +0000 UTC" deadline="2027-10-07 16:34:17.227196928 +0000 UTC"
Apr 21 04:39:08.396668 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:08.396666 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12827h55m8.830534434s"
Apr 21 04:39:08.481566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:08.480852 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:08.481566 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:08.480999 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:08.481566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:08.481129 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:08.481566 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:08.481208 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:09.999025 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:09.998984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:09.999522 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:09.999204 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:09.999522 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:09.999269 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:39:13.999249588 +0000 UTC m=+9.120807378 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:10.201325 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:10.201284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:10.201501 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:10.201465 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:39:10.201501 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:10.201486 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:39:10.201501 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:10.201498 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fmqbk for pod openshift-network-diagnostics/network-check-target-n47bc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:10.201642 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:10.201564 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk podName:f1f8e1ef-1033-4a98-837c-e59fe409b8fa nodeName:}" failed. No retries permitted until 2026-04-21 04:39:14.201545019 +0000 UTC m=+9.323102829 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fmqbk" (UniqueName: "kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk") pod "network-check-target-n47bc" (UID: "f1f8e1ef-1033-4a98-837c-e59fe409b8fa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:10.481468 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:10.480889 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:10.481468 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:10.481031 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:10.481468 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:10.481434 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:10.481827 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:10.481552 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:12.480639 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:12.480389 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:12.480639 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:12.480520 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:12.480639 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:12.480519 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:12.480639 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:12.480603 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:14.033524 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:14.033291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:14.033524 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.033472 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:14.034005 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.033541 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:39:22.033522775 +0000 UTC m=+17.155080565 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:14.234855 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:14.234748 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:14.235050 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.234937 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:39:14.235050 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.234964 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:39:14.235050 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.234977 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fmqbk for pod openshift-network-diagnostics/network-check-target-n47bc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:14.235050 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.235044 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk podName:f1f8e1ef-1033-4a98-837c-e59fe409b8fa nodeName:}" failed. No retries permitted until 2026-04-21 04:39:22.235024966 +0000 UTC m=+17.356582763 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fmqbk" (UniqueName: "kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk") pod "network-check-target-n47bc" (UID: "f1f8e1ef-1033-4a98-837c-e59fe409b8fa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:14.480879 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:14.480846 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:14.481089 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.480971 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:14.481165 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:14.481092 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:14.481213 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:14.481190 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:16.480914 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:16.480879 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:16.480914 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:16.480917 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:16.481418 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:16.480995 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:16.481418 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:16.481130 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:18.481256 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:18.481206 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:18.481256 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:18.481232 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:18.481782 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:18.481331 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:18.481782 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:18.481460 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:20.481323 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:20.481283 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:20.481747 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:20.481305 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:20.481747 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:20.481409 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:20.481747 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:20.481538 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:22.091182 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:22.091152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:22.091684 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.091294 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:22.091684 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.091367 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:39:38.091345453 +0000 UTC m=+33.212903258 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:22.292510 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:22.292467 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:22.292700 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.292635 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:39:22.292700 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.292662 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:39:22.292700 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.292675 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fmqbk for pod openshift-network-diagnostics/network-check-target-n47bc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:22.292857 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.292740 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk podName:f1f8e1ef-1033-4a98-837c-e59fe409b8fa nodeName:}" failed. No retries permitted until 2026-04-21 04:39:38.292722582 +0000 UTC m=+33.414280374 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fmqbk" (UniqueName: "kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk") pod "network-check-target-n47bc" (UID: "f1f8e1ef-1033-4a98-837c-e59fe409b8fa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:22.481338 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:22.481294 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:22.481504 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:22.481297 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:22.481504 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.481438 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:22.481504 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:22.481492 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:24.481170 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:24.481131 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:24.481612 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:24.481139 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:24.481612 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:24.481258 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:24.481612 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:24.481319 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:25.701235 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.700660 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerStarted","Data":"5b2dccc535a53ee5fae12527365dfe49ef80fa15d2fc174eb3db231d88dfc1ff"}
Apr 21 04:39:25.702346 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.702323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" event={"ID":"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831","Type":"ContainerStarted","Data":"9d86586c4fff37e78a11fdfab4ae9e57d302315e9e6dd16534d2e4d9ed90b419"}
Apr 21 04:39:25.703819 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.703796 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n7x9t" event={"ID":"a74e0bd7-e17f-4529-9299-93c38644ab68","Type":"ContainerStarted","Data":"341ed2a653924e9d70229fcf37b522157e77c0dc1f09f51f71f1633738bca99d"}
Apr 21 04:39:25.707212 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.705361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8djwj" event={"ID":"12c88bd4-6fd7-4968-a57a-c248d0a14470","Type":"ContainerStarted","Data":"ff0519222670856d49f938463eb548f1c35e436f024f66720b077af42f866316"}
Apr 21 04:39:25.707927 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.707907 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log"
Apr 21 04:39:25.708347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.708329 2568 generic.go:358] "Generic (PLEG): container finished" podID="8b8f621a-cf82-4827-a595-5b1724d2e7df" containerID="383dbd242e95fdd1407764fece870fa46a3bf892526cccb46593e29972d0d9e3" exitCode=1
Apr 21 04:39:25.708416 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.708389 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"46c92d243759ae4845847cea7d2165f4622eb30167c6488efc634ab4583c911b"}
Apr 21 04:39:25.708416 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.708412 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"e7ed2b815debda34a5186f07390d73955783e2e5cad9dada25e402a25374f7df"}
Apr 21 04:39:25.708480 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.708425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"453bb19cd003df57fa221316d910f41f55a5b30520e04194749b0b8f8d3289be"}
Apr 21 04:39:25.708480 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.708433 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"999e38618f4523e7bcb04430ceb8cef49ec658c99eae7f3f25618000775b8a00"}
Apr 21 04:39:25.708480 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.708442 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerDied","Data":"383dbd242e95fdd1407764fece870fa46a3bf892526cccb46593e29972d0d9e3"}
Apr 21 04:39:25.708480 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.708451 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"6332b837c693ed7d0a4faffbbc619c77e73b4bfc09bf3a26e7c4f6597989fcb7"}
Apr 21 04:39:25.709718 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.709700 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" event={"ID":"4404daed50f62015edf6e395cc7a9dec","Type":"ContainerStarted","Data":"32477212fa75f6808f8a88066b3b7c625a7165836dbd030317fe3c3f30da5d5d"}
Apr 21 04:39:25.710907 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.710888 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" event={"ID":"b815c8ee131098fba59a15d8b002ae1a","Type":"ContainerStarted","Data":"09c6c7f3729fc7ab00a8033a086116ddfe4e6806545d56a9840243360aca98ca"}
Apr 21 04:39:25.712212 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.712186 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2pbgh" event={"ID":"18ab0325-5097-4d89-bf24-9c599b9efbdc","Type":"ContainerStarted","Data":"f2c0c2f91c986e13c48bcd83546c0a6341496c4e6c69d306c8f8f904871faf8b"}
Apr 21 04:39:25.713496 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.713469 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vq49v" event={"ID":"800e074a-3700-4f73-b98f-4007eadc4414","Type":"ContainerStarted","Data":"db6953dc179f369e8c71170137a8170e42c1aa410b82943479a021239c5cd940"}
Apr 21 04:39:25.714767 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.714749 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hm47b" event={"ID":"fed25630-f93d-40db-800e-f8042fc4f7ca","Type":"ContainerStarted","Data":"776af65cfb46ce8cd69d220ab53a3c3753f9e8e6b3562116697a0e90cf5cc43c"}
Apr 21 04:39:25.745429 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.745384 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hm47b" podStartSLOduration=2.643972171 podStartE2EDuration="20.745368828s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.495999821 +0000 UTC m=+1.617557620" lastFinishedPulling="2026-04-21 04:39:24.597396478 +0000 UTC m=+19.718954277" observedRunningTime="2026-04-21 04:39:25.745059121 +0000 UTC m=+20.866616931" watchObservedRunningTime="2026-04-21 04:39:25.745368828 +0000 UTC m=+20.866926638"
Apr 21 04:39:25.763117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.763054 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2pbgh" podStartSLOduration=2.57828859 podStartE2EDuration="20.763037005s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.80378163 +0000 UTC m=+1.925339433" lastFinishedPulling="2026-04-21 04:39:24.988530054 +0000 UTC m=+20.110087848" observedRunningTime="2026-04-21 04:39:25.762298565 +0000 UTC m=+20.883856371" watchObservedRunningTime="2026-04-21 04:39:25.763037005 +0000 UTC m=+20.884594817"
Apr 21 04:39:25.794264 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.794227 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-241.ec2.internal" podStartSLOduration=20.79421443 podStartE2EDuration="20.79421443s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:25.794084708 +0000 UTC m=+20.915642510" watchObservedRunningTime="2026-04-21 04:39:25.79421443 +0000 UTC m=+20.915772241"
Apr 21 04:39:25.830022 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.829966 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8djwj" podStartSLOduration=2.945294763 podStartE2EDuration="20.82995324s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.713982702 +0000 UTC m=+1.835540486" lastFinishedPulling="2026-04-21 04:39:24.598641164 +0000 UTC m=+19.720198963" observedRunningTime="2026-04-21 04:39:25.809558987 +0000 UTC m=+20.931116794" watchObservedRunningTime="2026-04-21 04:39:25.82995324 +0000 UTC m=+20.951511060"
Apr 21 04:39:25.830136 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.830094 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vq49v" podStartSLOduration=2.94999188 podStartE2EDuration="20.83006262s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.719248198 +0000 UTC m=+1.840805983" lastFinishedPulling="2026-04-21 04:39:24.599318925 +0000 UTC m=+19.720876723" observedRunningTime="2026-04-21 04:39:25.829589123 +0000 UTC m=+20.951146929" watchObservedRunningTime="2026-04-21 04:39:25.83006262 +0000 UTC m=+20.951620427"
Apr 21 04:39:25.846703 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:25.846656 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-n7x9t" podStartSLOduration=2.979227297 podStartE2EDuration="20.846640458s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.729992005 +0000 UTC m=+1.851549793" lastFinishedPulling="2026-04-21 04:39:24.597405161 +0000 UTC m=+19.718962954" observedRunningTime="2026-04-21 04:39:25.84459349 +0000 UTC m=+20.966151296" watchObservedRunningTime="2026-04-21 04:39:25.846640458 +0000 UTC m=+20.968198267"
Apr 21 04:39:26.480431 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.480404 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:26.480540 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:26.480518 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa"
Apr 21 04:39:26.480600 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.480407 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:26.480707 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:26.480686 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c"
Apr 21 04:39:26.577478 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.577456 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 04:39:26.718012 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.717934 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d82872b-b0cd-4247-abb3-ce1e75dfd32b" containerID="5b2dccc535a53ee5fae12527365dfe49ef80fa15d2fc174eb3db231d88dfc1ff" exitCode=0
Apr 21 04:39:26.718437 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.718010 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerDied","Data":"5b2dccc535a53ee5fae12527365dfe49ef80fa15d2fc174eb3db231d88dfc1ff"}
Apr 21 04:39:26.719602 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.719578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" event={"ID":"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831","Type":"ContainerStarted","Data":"8fd6564e8bbdc7cf4694fa56935f5f32a72b0debdefc7673ca894216efd4bb3e"}
Apr 21 04:39:26.720797 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.720705 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qjbk5" event={"ID":"78c76e75-e505-48b3-b431-e14495f564b9","Type":"ContainerStarted","Data":"176d1a5cf91f9a1899ded6e5173fece648ce83d0086a9c768dfacdf62b4c273a"}
Apr 21 04:39:26.722223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.722203 2568 generic.go:358] "Generic (PLEG): container finished" podID="4404daed50f62015edf6e395cc7a9dec" containerID="32477212fa75f6808f8a88066b3b7c625a7165836dbd030317fe3c3f30da5d5d" exitCode=0
Apr 21 04:39:26.722342 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.722314 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" event={"ID":"4404daed50f62015edf6e395cc7a9dec","Type":"ContainerDied","Data":"32477212fa75f6808f8a88066b3b7c625a7165836dbd030317fe3c3f30da5d5d"}
Apr 21 04:39:26.722446 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.722351 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" event={"ID":"4404daed50f62015edf6e395cc7a9dec","Type":"ContainerStarted","Data":"d4d150f8814ad566618b6d634ea07d5fec8aa3cbf9cbf0408a2ae132b13730d5"}
Apr 21 04:39:26.754790 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.754748 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-241.ec2.internal" podStartSLOduration=21.754732637 podStartE2EDuration="21.754732637s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:26.7545491 +0000 UTC m=+21.876106907" watchObservedRunningTime="2026-04-21 04:39:26.754732637 +0000 UTC m=+21.876290444"
Apr 21 04:39:26.769098 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:26.769033 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qjbk5" podStartSLOduration=3.9162601070000003 podStartE2EDuration="21.769019109s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.744660902 +0000 UTC m=+1.866218688" lastFinishedPulling="2026-04-21 04:39:24.597419897 +0000 UTC m=+19.718977690" observedRunningTime="2026-04-21 04:39:26.768551551 +0000 UTC m=+21.890109357" watchObservedRunningTime="2026-04-21 04:39:26.769019109 +0000 UTC m=+21.890576940"
Apr 21 04:39:27.429635 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:27.429528 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T04:39:26.577474418Z","UUID":"32506d9e-523e-40a1-a63d-b3d738834c54","Handler":null,"Name":"","Endpoint":""}
Apr 21 04:39:27.431659 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:27.431485 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 04:39:27.431659 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:27.431513 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 04:39:27.726279 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:27.726227 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" event={"ID":"94f8b350-f4aa-4dd0-b82a-7ab99b4b7831","Type":"ContainerStarted","Data":"7ace14eced44ee6467bb9523807ce643845b473fb07437baad88d28777fd2d57"}
Apr 21 04:39:27.729328 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:27.729303 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log"
Apr 21 04:39:27.729729 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:27.729702 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"86d6d780d8b6b9d98e18612c750c1082ae18febbe609887541ed156350d41279"}
Apr 21 04:39:27.744965 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:27.744917 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gd7sb" podStartSLOduration=2.253698371 podStartE2EDuration="22.744904164s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.762534515 +0000 UTC m=+1.884092299" lastFinishedPulling="2026-04-21 04:39:27.2537403 +0000 UTC m=+22.375298092" observedRunningTime="2026-04-21 04:39:27.744343255 +0000 UTC m=+22.865901076" watchObservedRunningTime="2026-04-21 04:39:27.744904164 +0000 UTC m=+22.866461971" Apr 21 04:39:28.252494 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:28.252234 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:28.252946 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:28.252922 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:28.480514 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:28.480482 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:28.480698 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:28.480600 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa" Apr 21 04:39:28.480698 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:28.480631 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:28.480791 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:28.480712 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c" Apr 21 04:39:30.481078 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:30.481033 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:30.481693 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:30.481040 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:30.481693 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:30.481170 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa" Apr 21 04:39:30.481693 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:30.481256 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c" Apr 21 04:39:31.466960 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.466920 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:31.467162 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.467047 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 04:39:31.467547 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.467523 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8djwj" Apr 21 04:39:31.739808 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.739724 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d82872b-b0cd-4247-abb3-ce1e75dfd32b" containerID="f6ccf2b2b2fa8dd96006407d86fbd16d92bef8c38153ad283996cd294367bbd7" exitCode=0 Apr 21 04:39:31.739808 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.739799 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerDied","Data":"f6ccf2b2b2fa8dd96006407d86fbd16d92bef8c38153ad283996cd294367bbd7"} Apr 21 04:39:31.744582 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.744564 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log" Apr 21 04:39:31.745279 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.745254 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"00f570a913871638f6b386deb5d806ddf1cf2131007b635574ac2670bf8dc9dd"} Apr 21 04:39:31.745685 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.745634 2568 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:31.745784 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.745766 2568 scope.go:117] "RemoveContainer" containerID="383dbd242e95fdd1407764fece870fa46a3bf892526cccb46593e29972d0d9e3" Apr 21 04:39:31.761197 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:31.761181 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:32.481432 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.481406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:32.481763 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.481412 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:32.481763 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:32.481516 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c" Apr 21 04:39:32.481763 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:32.481620 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa" Apr 21 04:39:32.749782 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.749762 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log" Apr 21 04:39:32.750106 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.750063 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" event={"ID":"8b8f621a-cf82-4827-a595-5b1724d2e7df","Type":"ContainerStarted","Data":"f9e26c7b5891b474c0a5726895a648e1ec98ff5e2c5bdee7855b1647e410afc1"} Apr 21 04:39:32.757126 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.757109 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 04:39:32.757353 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.757337 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:32.776154 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.776004 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:32.782819 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.782798 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n47bc"] Apr 21 04:39:32.782925 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.782894 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:32.783038 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:32.783018 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa" Apr 21 04:39:32.789490 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.787036 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z4rqh"] Apr 21 04:39:32.789490 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.787204 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:32.789490 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:32.787356 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c" Apr 21 04:39:32.792143 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:32.792091 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" podStartSLOduration=9.863823422 podStartE2EDuration="27.792058289s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.688579081 +0000 UTC m=+1.810136872" lastFinishedPulling="2026-04-21 04:39:24.616813952 +0000 UTC m=+19.738371739" observedRunningTime="2026-04-21 04:39:32.790512822 +0000 UTC m=+27.912070630" watchObservedRunningTime="2026-04-21 04:39:32.792058289 +0000 UTC m=+27.913616098" Apr 21 04:39:33.508174 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:33.508131 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv" Apr 21 04:39:33.754660 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:33.754627 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d82872b-b0cd-4247-abb3-ce1e75dfd32b" containerID="c2fe50e9ad9ac8ca1d891cd476090709a91afdaf0c25d4e5f8be4d5f6c5124ff" exitCode=0 Apr 21 04:39:33.755340 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:33.754697 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerDied","Data":"c2fe50e9ad9ac8ca1d891cd476090709a91afdaf0c25d4e5f8be4d5f6c5124ff"} Apr 21 04:39:34.481311 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:34.481283 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:34.481311 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:34.481296 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:34.481495 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:34.481382 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c" Apr 21 04:39:34.481495 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:34.481438 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa" Apr 21 04:39:34.758966 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:34.758932 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d82872b-b0cd-4247-abb3-ce1e75dfd32b" containerID="8f6ef7246c6299c9f01d9e0bb133a5909c98fe20b0e2db781a6dc523ef6d0e8d" exitCode=0 Apr 21 04:39:34.759460 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:34.758978 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerDied","Data":"8f6ef7246c6299c9f01d9e0bb133a5909c98fe20b0e2db781a6dc523ef6d0e8d"} Apr 21 04:39:36.480602 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.480556 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc" Apr 21 04:39:36.481117 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:36.480695 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n47bc" podUID="f1f8e1ef-1033-4a98-837c-e59fe409b8fa" Apr 21 04:39:36.481117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.480746 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh" Apr 21 04:39:36.481117 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:36.480849 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z4rqh" podUID="8743d92c-6080-4066-ad83-55bb582a3f6c" Apr 21 04:39:36.701575 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.701547 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-241.ec2.internal" event="NodeReady" Apr 21 04:39:36.701751 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.701672 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 04:39:36.746920 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.746831 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qjtkj"] Apr 21 04:39:36.750649 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.750623 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k8p5j"] Apr 21 04:39:36.750810 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.750791 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:36.753801 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.753770 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:39:36.754864 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.754842 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 04:39:36.754983 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.754875 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 04:39:36.754983 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.754899 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f9n52\"" Apr 21 04:39:36.756429 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.756410 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 04:39:36.756429 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.756427 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 04:39:36.756770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.756414 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 04:39:36.756770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.756668 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ljx4n\"" Apr 21 04:39:36.761233 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.761211 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qjtkj"] Apr 21 04:39:36.761461 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.761445 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k8p5j"] Apr 21 04:39:36.904110 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:39:36.904048 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qtg\" (UniqueName: \"kubernetes.io/projected/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-kube-api-access-m4qtg\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:39:36.904292 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.904161 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:36.904292 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.904212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70e12877-2720-42f7-b047-316b48c6b8fe-config-volume\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:36.904292 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.904241 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70e12877-2720-42f7-b047-316b48c6b8fe-tmp-dir\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:36.904406 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.904312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4758\" (UniqueName: \"kubernetes.io/projected/70e12877-2720-42f7-b047-316b48c6b8fe-kube-api-access-v4758\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" 
Apr 21 04:39:36.904406 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:36.904344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:39:37.005296 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:39:37.005296 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qtg\" (UniqueName: \"kubernetes.io/projected/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-kube-api-access-m4qtg\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:39:37.005518 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005314 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:37.005518 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70e12877-2720-42f7-b047-316b48c6b8fe-config-volume\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:37.005518 
ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.005363 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:39:37.005518 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70e12877-2720-42f7-b047-316b48c6b8fe-tmp-dir\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:37.005518 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.005426 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:37.50540655 +0000 UTC m=+32.626964339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found Apr 21 04:39:37.005518 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4758\" (UniqueName: \"kubernetes.io/projected/70e12877-2720-42f7-b047-316b48c6b8fe-kube-api-access-v4758\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:37.005518 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.005473 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:39:37.005822 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.005531 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:39:37.505513616 +0000 UTC m=+32.627071401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found Apr 21 04:39:37.005822 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70e12877-2720-42f7-b047-316b48c6b8fe-tmp-dir\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:37.005983 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.005958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70e12877-2720-42f7-b047-316b48c6b8fe-config-volume\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:37.019018 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.018847 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4758\" (UniqueName: \"kubernetes.io/projected/70e12877-2720-42f7-b047-316b48c6b8fe-kube-api-access-v4758\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:39:37.019183 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.018988 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qtg\" (UniqueName: \"kubernetes.io/projected/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-kube-api-access-m4qtg\") pod \"ingress-canary-k8p5j\" (UID: 
\"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:39:37.509369 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.509331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:39:37.509784 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:37.509410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj"
Apr 21 04:39:37.509784 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.509491 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:37.509784 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.509516 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:37.509784 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.509577 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:38.509555776 +0000 UTC m=+33.631113566 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found
Apr 21 04:39:37.509784 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:37.509595 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:39:38.509588167 +0000 UTC m=+33.631145951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found
Apr 21 04:39:38.112630 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.112585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:38.112806 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.112725 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:38.112806 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.112799 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:40:10.11278003 +0000 UTC m=+65.234337832 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 04:39:38.314019 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.313984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:38.314233 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.314181 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 04:39:38.314233 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.314204 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 04:39:38.314233 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.314215 2568 projected.go:194] Error preparing data for projected volume kube-api-access-fmqbk for pod openshift-network-diagnostics/network-check-target-n47bc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:38.314366 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.314272 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk podName:f1f8e1ef-1033-4a98-837c-e59fe409b8fa nodeName:}" failed. No retries permitted until 2026-04-21 04:40:10.314252586 +0000 UTC m=+65.435810374 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fmqbk" (UniqueName: "kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk") pod "network-check-target-n47bc" (UID: "f1f8e1ef-1033-4a98-837c-e59fe409b8fa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 04:39:38.480655 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.480616 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:39:38.480840 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.480614 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:39:38.485049 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.485022 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 04:39:38.485049 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.485048 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 04:39:38.485254 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.485107 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgxqw\""
Apr 21 04:39:38.485254 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.485120 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 04:39:38.485254 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.485119 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-979g8\""
Apr 21 04:39:38.516016 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.515972 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:39:38.516432 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:38.516043 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj"
Apr 21 04:39:38.516432 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.516155 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:38.516432 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.516185 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:38.516432 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.516234 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:40.516213649 +0000 UTC m=+35.637771449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found
Apr 21 04:39:38.516432 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:38.516255 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:39:40.516244966 +0000 UTC m=+35.637802757 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found
Apr 21 04:39:40.530991 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:40.530957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:39:40.531382 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:40.531016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj"
Apr 21 04:39:40.531382 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:40.531135 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:40.531382 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:40.531143 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:40.531382 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:40.531198 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:39:44.531184356 +0000 UTC m=+39.652742141 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found
Apr 21 04:39:40.531382 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:40.531213 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:44.531205863 +0000 UTC m=+39.652763647 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found
Apr 21 04:39:40.773309 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:40.773276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerStarted","Data":"11f542bb488413a8679a6018907f81cd94be3c29a7d16d7676714570da0c1373"}
Apr 21 04:39:41.777772 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:41.777741 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d82872b-b0cd-4247-abb3-ce1e75dfd32b" containerID="11f542bb488413a8679a6018907f81cd94be3c29a7d16d7676714570da0c1373" exitCode=0
Apr 21 04:39:41.778250 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:41.777801 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerDied","Data":"11f542bb488413a8679a6018907f81cd94be3c29a7d16d7676714570da0c1373"}
Apr 21 04:39:42.782489 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:42.782455 2568 generic.go:358] "Generic (PLEG): container finished" podID="4d82872b-b0cd-4247-abb3-ce1e75dfd32b" containerID="684ba74951d0e848451a4d399318dd7c581529b9ffd400a7b3e6a680e0df4a96" exitCode=0
Apr 21 04:39:42.782880 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:42.782525 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerDied","Data":"684ba74951d0e848451a4d399318dd7c581529b9ffd400a7b3e6a680e0df4a96"}
Apr 21 04:39:43.787809 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:43.787771 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-km9z2" event={"ID":"4d82872b-b0cd-4247-abb3-ce1e75dfd32b","Type":"ContainerStarted","Data":"3beb9c4978a48b783b4f7b5ff8ad3963f096c6f7684e5a55a7dabc9affdd41ea"}
Apr 21 04:39:43.811078 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:43.811030 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-km9z2" podStartSLOduration=5.084382388 podStartE2EDuration="38.811015058s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:39:06.808221165 +0000 UTC m=+1.929778950" lastFinishedPulling="2026-04-21 04:39:40.534853823 +0000 UTC m=+35.656411620" observedRunningTime="2026-04-21 04:39:43.810063216 +0000 UTC m=+38.931621022" watchObservedRunningTime="2026-04-21 04:39:43.811015058 +0000 UTC m=+38.932572862"
Apr 21 04:39:44.562165 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:44.562126 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:39:44.562340 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:44.562184 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj"
Apr 21 04:39:44.562340 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:44.562268 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:44.562340 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:44.562270 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:44.562340 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:44.562321 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:39:52.562307647 +0000 UTC m=+47.683865431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found
Apr 21 04:39:44.562340 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:44.562334 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:39:52.562328434 +0000 UTC m=+47.683886219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found
Apr 21 04:39:52.617440 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:52.617403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:39:52.617901 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:39:52.617476 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj"
Apr 21 04:39:52.617901 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:52.617578 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:39:52.617901 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:52.617635 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:39:52.617901 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:52.617650 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:08.617634636 +0000 UTC m=+63.739192421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found
Apr 21 04:39:52.617901 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:39:52.617697 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:40:08.617682825 +0000 UTC m=+63.739240610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found
Apr 21 04:40:04.769350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:04.769322 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhdlv"
Apr 21 04:40:08.633842 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:08.633805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:40:08.634254 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:08.633867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj"
Apr 21 04:40:08.634254 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:08.633958 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:40:08.634254 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:08.633961 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:40:08.634254 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:08.634017 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:40:40.634003308 +0000 UTC m=+95.755561093 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found
Apr 21 04:40:08.634254 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:08.634030 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:40.634024833 +0000 UTC m=+95.755582617 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found
Apr 21 04:40:10.143403 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.143352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:40:10.146263 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.146245 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 04:40:10.154219 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:10.154199 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 04:40:10.154288 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:10.154269 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs podName:8743d92c-6080-4066-ad83-55bb582a3f6c nodeName:}" failed. No retries permitted until 2026-04-21 04:41:14.154249129 +0000 UTC m=+129.275806917 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs") pod "network-metrics-daemon-z4rqh" (UID: "8743d92c-6080-4066-ad83-55bb582a3f6c") : secret "metrics-daemon-secret" not found
Apr 21 04:40:10.344508 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.344454 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:40:10.347263 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.347243 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 04:40:10.357349 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.357331 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 04:40:10.369039 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.369005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmqbk\" (UniqueName: \"kubernetes.io/projected/f1f8e1ef-1033-4a98-837c-e59fe409b8fa-kube-api-access-fmqbk\") pod \"network-check-target-n47bc\" (UID: \"f1f8e1ef-1033-4a98-837c-e59fe409b8fa\") " pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:40:10.601230 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.601149 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgxqw\""
Apr 21 04:40:10.608843 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.608822 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:40:10.730926 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.730890 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n47bc"]
Apr 21 04:40:10.733942 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:40:10.733915 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1f8e1ef_1033_4a98_837c_e59fe409b8fa.slice/crio-90e0b4bac61447b50e62801f92f2bc8671aa443f7a328193454ead87d6ce7367 WatchSource:0}: Error finding container 90e0b4bac61447b50e62801f92f2bc8671aa443f7a328193454ead87d6ce7367: Status 404 returned error can't find the container with id 90e0b4bac61447b50e62801f92f2bc8671aa443f7a328193454ead87d6ce7367
Apr 21 04:40:10.844720 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:10.844684 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n47bc" event={"ID":"f1f8e1ef-1033-4a98-837c-e59fe409b8fa","Type":"ContainerStarted","Data":"90e0b4bac61447b50e62801f92f2bc8671aa443f7a328193454ead87d6ce7367"}
Apr 21 04:40:13.852380 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:13.852293 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n47bc" event={"ID":"f1f8e1ef-1033-4a98-837c-e59fe409b8fa","Type":"ContainerStarted","Data":"e536ca1df5f49a64d6e053bf15e2c1772a60062fc95b8d1c0d2f90360c19f5a5"}
Apr 21 04:40:13.852746 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:13.852410 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:40:13.867439 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:13.867388 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n47bc" podStartSLOduration=66.059633226 podStartE2EDuration="1m8.867373954s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:40:10.735621001 +0000 UTC m=+65.857178787" lastFinishedPulling="2026-04-21 04:40:13.543361726 +0000 UTC m=+68.664919515" observedRunningTime="2026-04-21 04:40:13.866467328 +0000 UTC m=+68.988025147" watchObservedRunningTime="2026-04-21 04:40:13.867373954 +0000 UTC m=+68.988931760"
Apr 21 04:40:38.182432 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.182308 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-srd87"]
Apr 21 04:40:38.186617 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.186595 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-686d5855f4-tcz66"]
Apr 21 04:40:38.186759 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.186743 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.189084 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.189045 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 21 04:40:38.189212 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.189172 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 04:40:38.189271 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.189214 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 04:40:38.189271 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.189255 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:38.189827 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.189805 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 21 04:40:38.189922 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.189892 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-t2hvb\""
Apr 21 04:40:38.191837 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.191816 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 04:40:38.191965 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.191817 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 21 04:40:38.192662 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.192642 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-r25fb\""
Apr 21 04:40:38.192775 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.192642 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 21 04:40:38.192775 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.192642 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 21 04:40:38.192775 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.192722 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 21 04:40:38.192946 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.192907 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 04:40:38.194930 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.194913 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 21 04:40:38.197821 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.197799 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-srd87"]
Apr 21 04:40:38.198844 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.198826 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-686d5855f4-tcz66"]
Apr 21 04:40:38.232852 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.232821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-stats-auth\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:38.232852 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.232853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74465859-3c82-4f58-832b-74cc4fbe41ce-tmp\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.233110 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.232889 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74465859-3c82-4f58-832b-74cc4fbe41ce-serving-cert\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.233110 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.232968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:38.233110 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.233015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74465859-3c82-4f58-832b-74cc4fbe41ce-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.233110 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.233034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/74465859-3c82-4f58-832b-74cc4fbe41ce-snapshots\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.233110 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.233081 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:38.233299 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.233134 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74465859-3c82-4f58-832b-74cc4fbe41ce-service-ca-bundle\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.233299 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.233172 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-default-certificate\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:38.233299 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.233212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zdpx\" (UniqueName: \"kubernetes.io/projected/74465859-3c82-4f58-832b-74cc4fbe41ce-kube-api-access-7zdpx\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.233299 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.233242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stp7l\" (UniqueName: \"kubernetes.io/projected/e7fc8dd3-0312-4bab-a12c-6a11df14266e-kube-api-access-stp7l\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:38.284415 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.284390 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj"]
Apr 21 04:40:38.287217 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.287198 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl"]
Apr 21 04:40:38.287362 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.287344 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj"
Apr 21 04:40:38.289844 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.289824 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 04:40:38.289844 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.289840 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 04:40:38.290033 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.289824 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 04:40:38.290118 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.290101 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl"
Apr 21 04:40:38.290224 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.290206 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nh5zd\""
Apr 21 04:40:38.293126 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.293104 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 04:40:38.293848 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.293827 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bd9zl\""
Apr 21 04:40:38.294138 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.294121 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 21 04:40:38.294262 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.294243 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 21 04:40:38.294410 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.294390 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 21 04:40:38.297446 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.297422 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj"]
Apr 21 04:40:38.298081 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.298048 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl"]
Apr 21 04:40:38.334466 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-default-certificate\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:38.334466 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334471 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zdpx\" (UniqueName: \"kubernetes.io/projected/74465859-3c82-4f58-832b-74cc4fbe41ce-kube-api-access-7zdpx\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87"
Apr 21 04:40:38.334724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334493 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName:
\"kubernetes.io/configmap/8cf0095a-7001-4da1-893d-f6430e613fe9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.334724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334515 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.334724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stp7l\" (UniqueName: \"kubernetes.io/projected/e7fc8dd3-0312-4bab-a12c-6a11df14266e-kube-api-access-stp7l\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.334724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334691 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q4m4\" (UniqueName: \"kubernetes.io/projected/8cf0095a-7001-4da1-893d-f6430e613fe9-kube-api-access-8q4m4\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.334724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334725 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-stats-auth\") pod \"router-default-686d5855f4-tcz66\" (UID: 
\"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334741 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74465859-3c82-4f58-832b-74cc4fbe41ce-tmp\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334764 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24fw\" (UniqueName: \"kubernetes.io/projected/3a707581-ce9a-46b8-9335-8f18bd8dc98c-kube-api-access-g24fw\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334789 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74465859-3c82-4f58-832b-74cc4fbe41ce-serving-cert\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334818 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334862 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74465859-3c82-4f58-832b-74cc4fbe41ce-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/74465859-3c82-4f58-832b-74cc4fbe41ce-snapshots\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:40:38.334968 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.334960 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.335365 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.335010 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74465859-3c82-4f58-832b-74cc4fbe41ce-service-ca-bundle\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " 
pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.335365 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.335045 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:38.835018497 +0000 UTC m=+93.956576285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : configmap references non-existent config key: service-ca.crt Apr 21 04:40:38.335365 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.335209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74465859-3c82-4f58-832b-74cc4fbe41ce-tmp\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.335534 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.335401 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 04:40:38.335534 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.335454 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:38.835436434 +0000 UTC m=+93.956994233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : secret "router-metrics-certs-default" not found Apr 21 04:40:38.335638 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.335573 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/74465859-3c82-4f58-832b-74cc4fbe41ce-snapshots\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.335638 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.335617 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74465859-3c82-4f58-832b-74cc4fbe41ce-service-ca-bundle\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.335830 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.335810 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74465859-3c82-4f58-832b-74cc4fbe41ce-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.337178 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.337156 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-stats-auth\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.337268 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:40:38.337236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74465859-3c82-4f58-832b-74cc4fbe41ce-serving-cert\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.337446 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.337430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-default-certificate\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.346203 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.346175 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zdpx\" (UniqueName: \"kubernetes.io/projected/74465859-3c82-4f58-832b-74cc4fbe41ce-kube-api-access-7zdpx\") pod \"insights-operator-585dfdc468-srd87\" (UID: \"74465859-3c82-4f58-832b-74cc4fbe41ce\") " pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.346340 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.346240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stp7l\" (UniqueName: \"kubernetes.io/projected/e7fc8dd3-0312-4bab-a12c-6a11df14266e-kube-api-access-stp7l\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.436226 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.436132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8cf0095a-7001-4da1-893d-f6430e613fe9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: 
\"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.436226 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.436169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.436226 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.436199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q4m4\" (UniqueName: \"kubernetes.io/projected/8cf0095a-7001-4da1-893d-f6430e613fe9-kube-api-access-8q4m4\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.436492 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.436275 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 04:40:38.436492 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.436355 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls podName:8cf0095a-7001-4da1-893d-f6430e613fe9 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:38.936334314 +0000 UTC m=+94.057892099 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g25vj" (UID: "8cf0095a-7001-4da1-893d-f6430e613fe9") : secret "cluster-monitoring-operator-tls" not found Apr 21 04:40:38.436492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.436376 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g24fw\" (UniqueName: \"kubernetes.io/projected/3a707581-ce9a-46b8-9335-8f18bd8dc98c-kube-api-access-g24fw\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:40:38.436492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.436433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:40:38.436656 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.436546 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:40:38.436656 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.436592 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls podName:3a707581-ce9a-46b8-9335-8f18bd8dc98c nodeName:}" failed. No retries permitted until 2026-04-21 04:40:38.93657992 +0000 UTC m=+94.058137704 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bhmvl" (UID: "3a707581-ce9a-46b8-9335-8f18bd8dc98c") : secret "samples-operator-tls" not found Apr 21 04:40:38.436839 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.436819 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8cf0095a-7001-4da1-893d-f6430e613fe9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.445115 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.445087 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q4m4\" (UniqueName: \"kubernetes.io/projected/8cf0095a-7001-4da1-893d-f6430e613fe9-kube-api-access-8q4m4\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.445246 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.445222 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24fw\" (UniqueName: \"kubernetes.io/projected/3a707581-ce9a-46b8-9335-8f18bd8dc98c-kube-api-access-g24fw\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:40:38.496744 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.496711 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-srd87" Apr 21 04:40:38.606698 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.606669 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-srd87"] Apr 21 04:40:38.609536 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:40:38.609500 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74465859_3c82_4f58_832b_74cc4fbe41ce.slice/crio-03624ff4fbb77146861b4e4f43aa7d40d39f53ecaed5ecc93a59bb4e28382a6e WatchSource:0}: Error finding container 03624ff4fbb77146861b4e4f43aa7d40d39f53ecaed5ecc93a59bb4e28382a6e: Status 404 returned error can't find the container with id 03624ff4fbb77146861b4e4f43aa7d40d39f53ecaed5ecc93a59bb4e28382a6e Apr 21 04:40:38.839544 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.839457 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.839687 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.839620 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:38.839687 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.839630 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. 
No retries permitted until 2026-04-21 04:40:39.83961427 +0000 UTC m=+94.961172056 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : configmap references non-existent config key: service-ca.crt Apr 21 04:40:38.839763 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.839752 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 04:40:38.839812 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.839802 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:39.83978715 +0000 UTC m=+94.961344939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : secret "router-metrics-certs-default" not found Apr 21 04:40:38.900087 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.900047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-srd87" event={"ID":"74465859-3c82-4f58-832b-74cc4fbe41ce","Type":"ContainerStarted","Data":"03624ff4fbb77146861b4e4f43aa7d40d39f53ecaed5ecc93a59bb4e28382a6e"} Apr 21 04:40:38.940579 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.940552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:40:38.940704 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:38.940608 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:38.940704 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.940690 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:40:38.940823 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.940749 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls podName:3a707581-ce9a-46b8-9335-8f18bd8dc98c nodeName:}" failed. No retries permitted until 2026-04-21 04:40:39.940734888 +0000 UTC m=+95.062292678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bhmvl" (UID: "3a707581-ce9a-46b8-9335-8f18bd8dc98c") : secret "samples-operator-tls" not found Apr 21 04:40:38.940823 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.940694 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 04:40:38.940901 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:38.940824 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls podName:8cf0095a-7001-4da1-893d-f6430e613fe9 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:39.940813083 +0000 UTC m=+95.062370873 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g25vj" (UID: "8cf0095a-7001-4da1-893d-f6430e613fe9") : secret "cluster-monitoring-operator-tls" not found Apr 21 04:40:39.178467 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.178437 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wdf4b"] Apr 21 04:40:39.182638 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.182622 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" Apr 21 04:40:39.185221 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.185200 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 04:40:39.186153 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.186131 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-n7t6p\"" Apr 21 04:40:39.186228 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.186163 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 04:40:39.186228 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.186213 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 04:40:39.186333 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.186226 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:40:39.194914 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.191464 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wdf4b"] Apr 21 04:40:39.194914 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.194150 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 04:40:39.242912 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.242875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f53d634-201a-47a8-bb8f-a939d320e536-serving-cert\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.242912 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.242905 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d634-201a-47a8-bb8f-a939d320e536-trusted-ca\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.243107 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.242928 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl99x\" (UniqueName: \"kubernetes.io/projected/8f53d634-201a-47a8-bb8f-a939d320e536-kube-api-access-zl99x\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.243107 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.243079 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d634-201a-47a8-bb8f-a939d320e536-config\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.344138 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.344102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f53d634-201a-47a8-bb8f-a939d320e536-serving-cert\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.344138 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.344144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d634-201a-47a8-bb8f-a939d320e536-trusted-ca\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.344371 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.344167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zl99x\" (UniqueName: \"kubernetes.io/projected/8f53d634-201a-47a8-bb8f-a939d320e536-kube-api-access-zl99x\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.344371 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.344269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d634-201a-47a8-bb8f-a939d320e536-config\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.345220 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.345194 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f53d634-201a-47a8-bb8f-a939d320e536-config\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.345761 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.345728 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f53d634-201a-47a8-bb8f-a939d320e536-trusted-ca\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.347038 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.347017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f53d634-201a-47a8-bb8f-a939d320e536-serving-cert\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.353098 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.353043 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl99x\" (UniqueName: \"kubernetes.io/projected/8f53d634-201a-47a8-bb8f-a939d320e536-kube-api-access-zl99x\") pod \"console-operator-9d4b6777b-wdf4b\" (UID: \"8f53d634-201a-47a8-bb8f-a939d320e536\") " pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.496372 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.496297 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:39.618092 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.618040 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wdf4b"]
Apr 21 04:40:39.622086 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:40:39.622037 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f53d634_201a_47a8_bb8f_a939d320e536.slice/crio-22cf75a29c9ada69af122e415f5dfc0302691c5df139044a30b12b3525b8ca74 WatchSource:0}: Error finding container 22cf75a29c9ada69af122e415f5dfc0302691c5df139044a30b12b3525b8ca74: Status 404 returned error can't find the container with id 22cf75a29c9ada69af122e415f5dfc0302691c5df139044a30b12b3525b8ca74
Apr 21 04:40:39.851316 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.851169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:39.851467 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:39.851344 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:41.851320651 +0000 UTC m=+96.972878447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : configmap references non-existent config key: service-ca.crt
Apr 21 04:40:39.851467 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.851379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:39.851584 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:39.851535 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 04:40:39.851637 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:39.851586 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:41.851571146 +0000 UTC m=+96.973128936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : secret "router-metrics-certs-default" not found
Apr 21 04:40:39.902957 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.902914 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" event={"ID":"8f53d634-201a-47a8-bb8f-a939d320e536","Type":"ContainerStarted","Data":"22cf75a29c9ada69af122e415f5dfc0302691c5df139044a30b12b3525b8ca74"}
Apr 21 04:40:39.952667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.952633 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj"
Apr 21 04:40:39.952869 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:39.952782 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 04:40:39.952869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:39.952827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl"
Apr 21 04:40:39.952869 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:39.952860 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls podName:8cf0095a-7001-4da1-893d-f6430e613fe9 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:41.952836136 +0000 UTC m=+97.074393934 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g25vj" (UID: "8cf0095a-7001-4da1-893d-f6430e613fe9") : secret "cluster-monitoring-operator-tls" not found
Apr 21 04:40:39.953048 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:39.952918 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 04:40:39.953048 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:39.953011 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls podName:3a707581-ce9a-46b8-9335-8f18bd8dc98c nodeName:}" failed. No retries permitted until 2026-04-21 04:40:41.952967298 +0000 UTC m=+97.074525097 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bhmvl" (UID: "3a707581-ce9a-46b8-9335-8f18bd8dc98c") : secret "samples-operator-tls" not found
Apr 21 04:40:40.658553 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:40.658519 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j"
Apr 21 04:40:40.658998 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:40.658616 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj"
Apr 21 04:40:40.658998 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:40.658693 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 04:40:40.658998 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:40.658763 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 04:40:40.658998 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:40.658777 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert podName:f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:44.658753689 +0000 UTC m=+159.780311488 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert") pod "ingress-canary-k8p5j" (UID: "f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8") : secret "canary-serving-cert" not found
Apr 21 04:40:40.658998 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:40.658822 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls podName:70e12877-2720-42f7-b047-316b48c6b8fe nodeName:}" failed. No retries permitted until 2026-04-21 04:41:44.658805768 +0000 UTC m=+159.780363569 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls") pod "dns-default-qjtkj" (UID: "70e12877-2720-42f7-b047-316b48c6b8fe") : secret "dns-default-metrics-tls" not found
Apr 21 04:40:40.906357 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:40.906319 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-srd87" event={"ID":"74465859-3c82-4f58-832b-74cc4fbe41ce","Type":"ContainerStarted","Data":"95c495886fe5c87fc81f96eebb87340ac818777debf8c32d6723b0bee12e6de4"}
Apr 21 04:40:40.923158 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:40.923051 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-srd87" podStartSLOduration=0.826902938 podStartE2EDuration="2.923035158s" podCreationTimestamp="2026-04-21 04:40:38 +0000 UTC" firstStartedPulling="2026-04-21 04:40:38.611225753 +0000 UTC m=+93.732783539" lastFinishedPulling="2026-04-21 04:40:40.707357959 +0000 UTC m=+95.828915759" observedRunningTime="2026-04-21 04:40:40.922326041 +0000 UTC m=+96.043883871" watchObservedRunningTime="2026-04-21 04:40:40.923035158 +0000 UTC m=+96.044592966"
Apr 21 04:40:41.871219 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.871182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:41.871657 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.871250 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:41.871657 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:41.871353 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 04:40:41.871657 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:41.871430 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:45.871407073 +0000 UTC m=+100.992964876 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : configmap references non-existent config key: service-ca.crt
Apr 21 04:40:41.871657 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:41.871459 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:45.871445906 +0000 UTC m=+100.993003698 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : secret "router-metrics-certs-default" not found
Apr 21 04:40:41.910463 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.910431 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/0.log"
Apr 21 04:40:41.910602 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.910469 2568 generic.go:358] "Generic (PLEG): container finished" podID="8f53d634-201a-47a8-bb8f-a939d320e536" containerID="d24497b496010f3d74e2f5e00570324a3f87e0a44e5be4f2dd9ccfcbd73081a9" exitCode=255
Apr 21 04:40:41.910602 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.910560 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" event={"ID":"8f53d634-201a-47a8-bb8f-a939d320e536","Type":"ContainerDied","Data":"d24497b496010f3d74e2f5e00570324a3f87e0a44e5be4f2dd9ccfcbd73081a9"}
Apr 21 04:40:41.910826 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.910812 2568 scope.go:117] "RemoveContainer" containerID="d24497b496010f3d74e2f5e00570324a3f87e0a44e5be4f2dd9ccfcbd73081a9"
Apr 21 04:40:41.972460 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.972433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj"
Apr 21 04:40:41.972601 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:41.972562 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 04:40:41.972668 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:41.972607 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls podName:8cf0095a-7001-4da1-893d-f6430e613fe9 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:45.972592374 +0000 UTC m=+101.094150158 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g25vj" (UID: "8cf0095a-7001-4da1-893d-f6430e613fe9") : secret "cluster-monitoring-operator-tls" not found
Apr 21 04:40:41.972810 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:41.972788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl"
Apr 21 04:40:41.972923 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:41.972905 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 04:40:41.972981 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:41.972959 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls podName:3a707581-ce9a-46b8-9335-8f18bd8dc98c nodeName:}" failed. No retries permitted until 2026-04-21 04:40:45.972942014 +0000 UTC m=+101.094499813 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bhmvl" (UID: "3a707581-ce9a-46b8-9335-8f18bd8dc98c") : secret "samples-operator-tls" not found
Apr 21 04:40:42.914244 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:42.914214 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/1.log"
Apr 21 04:40:42.914639 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:42.914587 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/0.log"
Apr 21 04:40:42.914639 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:42.914619 2568 generic.go:358] "Generic (PLEG): container finished" podID="8f53d634-201a-47a8-bb8f-a939d320e536" containerID="95a8a9df2c7540de316001208d1ea62334eced6393dcf8cfec0b423f47ac1bf1" exitCode=255
Apr 21 04:40:42.914724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:42.914650 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" event={"ID":"8f53d634-201a-47a8-bb8f-a939d320e536","Type":"ContainerDied","Data":"95a8a9df2c7540de316001208d1ea62334eced6393dcf8cfec0b423f47ac1bf1"}
Apr 21 04:40:42.914724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:42.914702 2568 scope.go:117] "RemoveContainer" containerID="d24497b496010f3d74e2f5e00570324a3f87e0a44e5be4f2dd9ccfcbd73081a9"
Apr 21 04:40:42.914949 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:42.914918 2568 scope.go:117] "RemoveContainer" containerID="95a8a9df2c7540de316001208d1ea62334eced6393dcf8cfec0b423f47ac1bf1"
Apr 21 04:40:42.915148 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:42.915131 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wdf4b_openshift-console-operator(8f53d634-201a-47a8-bb8f-a939d320e536)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" podUID="8f53d634-201a-47a8-bb8f-a939d320e536"
Apr 21 04:40:43.917818 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:43.917792 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/1.log"
Apr 21 04:40:43.918227 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:43.918141 2568 scope.go:117] "RemoveContainer" containerID="95a8a9df2c7540de316001208d1ea62334eced6393dcf8cfec0b423f47ac1bf1"
Apr 21 04:40:43.918318 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:43.918301 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wdf4b_openshift-console-operator(8f53d634-201a-47a8-bb8f-a939d320e536)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" podUID="8f53d634-201a-47a8-bb8f-a939d320e536"
Apr 21 04:40:44.383821 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:44.383800 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n7x9t_a74e0bd7-e17f-4529-9299-93c38644ab68/dns-node-resolver/0.log"
Apr 21 04:40:44.857129 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:44.857043 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n47bc"
Apr 21 04:40:45.383722 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.383697 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hm47b_fed25630-f93d-40db-800e-f8042fc4f7ca/node-ca/0.log"
Apr 21 04:40:45.766233 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.766161 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"]
Apr 21 04:40:45.770002 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.769987 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"
Apr 21 04:40:45.772363 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.772332 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-wrzvn\""
Apr 21 04:40:45.774886 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.774865 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"]
Apr 21 04:40:45.903676 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.903624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:45.903879 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.903792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66"
Apr 21 04:40:45.903879 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:45.903843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7z5\" (UniqueName: \"kubernetes.io/projected/81bfb6af-a368-4121-9fc9-bf01f634b5de-kube-api-access-7p7z5\") pod \"network-check-source-8894fc9bd-5cxvc\" (UID: \"81bfb6af-a368-4121-9fc9-bf01f634b5de\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"
Apr 21 04:40:45.903977 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:45.903917 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:53.903892387 +0000 UTC m=+109.025450176 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : configmap references non-existent config key: service-ca.crt
Apr 21 04:40:45.903977 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:45.903916 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 04:40:45.903977 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:45.903976 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:40:53.903967969 +0000 UTC m=+109.025525753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : secret "router-metrics-certs-default" not found
Apr 21 04:40:46.004747 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.004711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl"
Apr 21 04:40:46.004916 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.004776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7z5\" (UniqueName: \"kubernetes.io/projected/81bfb6af-a368-4121-9fc9-bf01f634b5de-kube-api-access-7p7z5\") pod \"network-check-source-8894fc9bd-5cxvc\" (UID: \"81bfb6af-a368-4121-9fc9-bf01f634b5de\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"
Apr 21 04:40:46.004916 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.004804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj"
Apr 21 04:40:46.004916 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:46.004853 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 04:40:46.004916 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:46.004897 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 04:40:46.004916 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:46.004916 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls podName:3a707581-ce9a-46b8-9335-8f18bd8dc98c nodeName:}" failed. No retries permitted until 2026-04-21 04:40:54.004900746 +0000 UTC m=+109.126458531 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bhmvl" (UID: "3a707581-ce9a-46b8-9335-8f18bd8dc98c") : secret "samples-operator-tls" not found
Apr 21 04:40:46.005104 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:46.004946 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls podName:8cf0095a-7001-4da1-893d-f6430e613fe9 nodeName:}" failed. No retries permitted until 2026-04-21 04:40:54.004931492 +0000 UTC m=+109.126489277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g25vj" (UID: "8cf0095a-7001-4da1-893d-f6430e613fe9") : secret "cluster-monitoring-operator-tls" not found
Apr 21 04:40:46.012937 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.012916 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p7z5\" (UniqueName: \"kubernetes.io/projected/81bfb6af-a368-4121-9fc9-bf01f634b5de-kube-api-access-7p7z5\") pod \"network-check-source-8894fc9bd-5cxvc\" (UID: \"81bfb6af-a368-4121-9fc9-bf01f634b5de\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"
Apr 21 04:40:46.078740 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.078660 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"
Apr 21 04:40:46.186312 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.186254 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc"]
Apr 21 04:40:46.189919 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:40:46.189888 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81bfb6af_a368_4121_9fc9_bf01f634b5de.slice/crio-bc2d7fbd22f2c35b2cef377cf765adb770c72945bdb25b29408a874d78aa3e0e WatchSource:0}: Error finding container bc2d7fbd22f2c35b2cef377cf765adb770c72945bdb25b29408a874d78aa3e0e: Status 404 returned error can't find the container with id bc2d7fbd22f2c35b2cef377cf765adb770c72945bdb25b29408a874d78aa3e0e
Apr 21 04:40:46.924717 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.924682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc" event={"ID":"81bfb6af-a368-4121-9fc9-bf01f634b5de","Type":"ContainerStarted","Data":"60831913ff1c0a7c71555a4ea9d115903879b8f6196914c6d2d4e9c56b205195"}
Apr 21 04:40:46.924717 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.924718 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc" event={"ID":"81bfb6af-a368-4121-9fc9-bf01f634b5de","Type":"ContainerStarted","Data":"bc2d7fbd22f2c35b2cef377cf765adb770c72945bdb25b29408a874d78aa3e0e"}
Apr 21 04:40:46.940959 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:46.940914 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5cxvc" podStartSLOduration=1.94089847 podStartE2EDuration="1.94089847s" podCreationTimestamp="2026-04-21 04:40:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:40:46.939707917 +0000 UTC m=+102.061265727" watchObservedRunningTime="2026-04-21 04:40:46.94089847 +0000 UTC m=+102.062456277"
Apr 21 04:40:47.316977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.316898 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"]
Apr 21 04:40:47.322346 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.322327 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"
Apr 21 04:40:47.325062 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.325036 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6z9mm\""
Apr 21 04:40:47.325161 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.325082 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 21 04:40:47.326034 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.326014 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 21 04:40:47.328517 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.328496 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"]
Apr 21 04:40:47.418053 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.417997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzkd6\" (UniqueName: \"kubernetes.io/projected/02ca5737-060f-403c-8352-2dec9f92bd2e-kube-api-access-jzkd6\") pod \"migrator-74bb7799d9-vlqww\" (UID: \"02ca5737-060f-403c-8352-2dec9f92bd2e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"
Apr 21 04:40:47.519010 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.518961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzkd6\" (UniqueName: \"kubernetes.io/projected/02ca5737-060f-403c-8352-2dec9f92bd2e-kube-api-access-jzkd6\") pod \"migrator-74bb7799d9-vlqww\" (UID: \"02ca5737-060f-403c-8352-2dec9f92bd2e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"
Apr 21 04:40:47.527444 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.527408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzkd6\" (UniqueName: \"kubernetes.io/projected/02ca5737-060f-403c-8352-2dec9f92bd2e-kube-api-access-jzkd6\") pod \"migrator-74bb7799d9-vlqww\" (UID: \"02ca5737-060f-403c-8352-2dec9f92bd2e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"
Apr 21 04:40:47.631339 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.631304 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"
Apr 21 04:40:47.745826 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.745792 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww"]
Apr 21 04:40:47.749322 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:40:47.749291 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ca5737_060f_403c_8352_2dec9f92bd2e.slice/crio-9e75295455408aeaa9d51cd543c8a9237cc2f91be8fbef8fe03d1594ef18710e WatchSource:0}: Error finding container 9e75295455408aeaa9d51cd543c8a9237cc2f91be8fbef8fe03d1594ef18710e: Status 404 returned error can't find the container with id 9e75295455408aeaa9d51cd543c8a9237cc2f91be8fbef8fe03d1594ef18710e
Apr 21 04:40:47.927817 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:47.927720 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww" event={"ID":"02ca5737-060f-403c-8352-2dec9f92bd2e","Type":"ContainerStarted","Data":"9e75295455408aeaa9d51cd543c8a9237cc2f91be8fbef8fe03d1594ef18710e"}
Apr 21 04:40:49.496969 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:49.496935 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:49.496969 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:49.496970 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b"
Apr 21 04:40:49.497442 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:49.497337 2568 scope.go:117] "RemoveContainer" containerID="95a8a9df2c7540de316001208d1ea62334eced6393dcf8cfec0b423f47ac1bf1"
Apr 21 04:40:49.497507 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:49.497490 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wdf4b_openshift-console-operator(8f53d634-201a-47a8-bb8f-a939d320e536)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" podUID="8f53d634-201a-47a8-bb8f-a939d320e536"
Apr 21 04:40:49.933706 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:49.933667 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww" event={"ID":"02ca5737-060f-403c-8352-2dec9f92bd2e","Type":"ContainerStarted","Data":"072b3b2304b0c3232e237f805fec52090713a8e5452d96f510eb462f1b451052"}
Apr 21 04:40:49.933706 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:49.933706 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww" event={"ID":"02ca5737-060f-403c-8352-2dec9f92bd2e","Type":"ContainerStarted","Data":"a47cd156a088c421f91a61cb210672cf5ddef9050a3f8d41c676dba961439c73"}
Apr 21 04:40:49.950592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:49.950538 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vlqww" podStartSLOduration=1.7375312109999999 podStartE2EDuration="2.950521347s" podCreationTimestamp="2026-04-21 04:40:47 +0000 UTC"
firstStartedPulling="2026-04-21 04:40:47.75165405 +0000 UTC m=+102.873211836" lastFinishedPulling="2026-04-21 04:40:48.964644173 +0000 UTC m=+104.086201972" observedRunningTime="2026-04-21 04:40:49.949588139 +0000 UTC m=+105.071145947" watchObservedRunningTime="2026-04-21 04:40:49.950521347 +0000 UTC m=+105.072079154" Apr 21 04:40:53.977560 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:53.977525 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:53.977946 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:53.977601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:40:53.977946 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:53.977693 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 04:40:53.977946 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:53.977697 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:41:09.977679074 +0000 UTC m=+125.099236859 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : configmap references non-existent config key: service-ca.crt Apr 21 04:40:53.977946 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:53.977739 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs podName:e7fc8dd3-0312-4bab-a12c-6a11df14266e nodeName:}" failed. No retries permitted until 2026-04-21 04:41:09.977726978 +0000 UTC m=+125.099284763 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs") pod "router-default-686d5855f4-tcz66" (UID: "e7fc8dd3-0312-4bab-a12c-6a11df14266e") : secret "router-metrics-certs-default" not found Apr 21 04:40:54.078368 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:54.078330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:40:54.078528 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:40:54.078401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:40:54.078528 ip-10-0-141-241 kubenswrapper[2568]: E0421 
04:40:54.078467 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:40:54.078528 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:54.078496 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 04:40:54.078624 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:54.078538 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls podName:3a707581-ce9a-46b8-9335-8f18bd8dc98c nodeName:}" failed. No retries permitted until 2026-04-21 04:41:10.078523714 +0000 UTC m=+125.200081503 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bhmvl" (UID: "3a707581-ce9a-46b8-9335-8f18bd8dc98c") : secret "samples-operator-tls" not found Apr 21 04:40:54.078624 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:40:54.078552 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls podName:8cf0095a-7001-4da1-893d-f6430e613fe9 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:10.078545892 +0000 UTC m=+125.200103676 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g25vj" (UID: "8cf0095a-7001-4da1-893d-f6430e613fe9") : secret "cluster-monitoring-operator-tls" not found Apr 21 04:41:00.480733 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:00.480706 2568 scope.go:117] "RemoveContainer" containerID="95a8a9df2c7540de316001208d1ea62334eced6393dcf8cfec0b423f47ac1bf1" Apr 21 04:41:00.960664 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:00.960636 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log" Apr 21 04:41:00.961022 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:00.961003 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/1.log" Apr 21 04:41:00.961151 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:00.961042 2568 generic.go:358] "Generic (PLEG): container finished" podID="8f53d634-201a-47a8-bb8f-a939d320e536" containerID="f1a8e3b3a338ea745b6cff411b7d9561d89c571841a5c43ff488b3739a36d571" exitCode=255 Apr 21 04:41:00.961151 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:00.961103 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" event={"ID":"8f53d634-201a-47a8-bb8f-a939d320e536","Type":"ContainerDied","Data":"f1a8e3b3a338ea745b6cff411b7d9561d89c571841a5c43ff488b3739a36d571"} Apr 21 04:41:00.961151 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:00.961146 2568 scope.go:117] "RemoveContainer" containerID="95a8a9df2c7540de316001208d1ea62334eced6393dcf8cfec0b423f47ac1bf1" Apr 21 04:41:00.961466 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:00.961452 2568 scope.go:117] 
"RemoveContainer" containerID="f1a8e3b3a338ea745b6cff411b7d9561d89c571841a5c43ff488b3739a36d571" Apr 21 04:41:00.961655 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:00.961631 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-wdf4b_openshift-console-operator(8f53d634-201a-47a8-bb8f-a939d320e536)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" podUID="8f53d634-201a-47a8-bb8f-a939d320e536" Apr 21 04:41:01.964356 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:01.964330 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log" Apr 21 04:41:09.496400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:09.496368 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" Apr 21 04:41:09.496400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:09.496404 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" Apr 21 04:41:09.496957 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:09.496789 2568 scope.go:117] "RemoveContainer" containerID="f1a8e3b3a338ea745b6cff411b7d9561d89c571841a5c43ff488b3739a36d571" Apr 21 04:41:09.497037 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:09.497014 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-wdf4b_openshift-console-operator(8f53d634-201a-47a8-bb8f-a939d320e536)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" 
podUID="8f53d634-201a-47a8-bb8f-a939d320e536" Apr 21 04:41:10.008228 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.008195 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:10.008430 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.008303 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:10.008857 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.008837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7fc8dd3-0312-4bab-a12c-6a11df14266e-service-ca-bundle\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:10.010514 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.010496 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7fc8dd3-0312-4bab-a12c-6a11df14266e-metrics-certs\") pod \"router-default-686d5855f4-tcz66\" (UID: \"e7fc8dd3-0312-4bab-a12c-6a11df14266e\") " pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:10.109045 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.109009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:41:10.109225 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.109083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:41:10.111427 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.111397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8cf0095a-7001-4da1-893d-f6430e613fe9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g25vj\" (UID: \"8cf0095a-7001-4da1-893d-f6430e613fe9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:41:10.111536 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.111429 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a707581-ce9a-46b8-9335-8f18bd8dc98c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bhmvl\" (UID: \"3a707581-ce9a-46b8-9335-8f18bd8dc98c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:41:10.305505 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.305425 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-r25fb\"" Apr 21 04:41:10.313651 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.313632 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:10.401830 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.401805 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nh5zd\"" Apr 21 04:41:10.407032 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.407004 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bd9zl\"" Apr 21 04:41:10.409896 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.409876 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" Apr 21 04:41:10.415650 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.415627 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" Apr 21 04:41:10.428140 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.427743 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-686d5855f4-tcz66"] Apr 21 04:41:10.429968 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:10.429939 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7fc8dd3_0312_4bab_a12c_6a11df14266e.slice/crio-c0a35ba5c47612f22ce874d97e5d06cdfaa7f6c112b68d2738e07e02b872f204 WatchSource:0}: Error finding container c0a35ba5c47612f22ce874d97e5d06cdfaa7f6c112b68d2738e07e02b872f204: Status 404 returned error can't find the container with id c0a35ba5c47612f22ce874d97e5d06cdfaa7f6c112b68d2738e07e02b872f204 Apr 21 04:41:10.535768 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.535736 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj"] Apr 21 04:41:10.541494 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:10.541466 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf0095a_7001_4da1_893d_f6430e613fe9.slice/crio-b37b3dcc8161f2fbe9b9a94cc33f1d74dec27bc2dbb250295208fec406b13d5f WatchSource:0}: Error finding container b37b3dcc8161f2fbe9b9a94cc33f1d74dec27bc2dbb250295208fec406b13d5f: Status 404 returned error can't find the container with id b37b3dcc8161f2fbe9b9a94cc33f1d74dec27bc2dbb250295208fec406b13d5f Apr 21 04:41:10.552901 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.552874 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl"] Apr 21 04:41:10.985810 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.985772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" event={"ID":"3a707581-ce9a-46b8-9335-8f18bd8dc98c","Type":"ContainerStarted","Data":"ea44b6e5d06d0c0c123dce7b798ec5344693404dbdcf4d80a1a6f80c9eaf5fe8"} Apr 21 04:41:10.986719 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.986689 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" event={"ID":"8cf0095a-7001-4da1-893d-f6430e613fe9","Type":"ContainerStarted","Data":"b37b3dcc8161f2fbe9b9a94cc33f1d74dec27bc2dbb250295208fec406b13d5f"} Apr 21 04:41:10.987854 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:10.987833 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-686d5855f4-tcz66" event={"ID":"e7fc8dd3-0312-4bab-a12c-6a11df14266e","Type":"ContainerStarted","Data":"b7cd4368d483fe7ae5ad725b6cba32fd4377e57b5a758ac5d9419d6b93098582"} Apr 21 04:41:10.987923 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:41:10.987860 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-686d5855f4-tcz66" event={"ID":"e7fc8dd3-0312-4bab-a12c-6a11df14266e","Type":"ContainerStarted","Data":"c0a35ba5c47612f22ce874d97e5d06cdfaa7f6c112b68d2738e07e02b872f204"} Apr 21 04:41:11.006314 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:11.006279 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-686d5855f4-tcz66" podStartSLOduration=33.006268836 podStartE2EDuration="33.006268836s" podCreationTimestamp="2026-04-21 04:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:41:11.005556458 +0000 UTC m=+126.127114266" watchObservedRunningTime="2026-04-21 04:41:11.006268836 +0000 UTC m=+126.127826642" Apr 21 04:41:11.314784 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:11.314694 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:11.317664 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:11.317639 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:11.991137 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:11.991101 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:11.992543 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:11.992513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-686d5855f4-tcz66" Apr 21 04:41:12.111315 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.111281 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l4ghz"] Apr 21 04:41:12.114551 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:41:12.114531 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.117185 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.117160 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 04:41:12.117487 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.117466 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-x7jkx\"" Apr 21 04:41:12.120081 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.120045 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 04:41:12.125489 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.125397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/58730f05-2dc5-4837-8ad1-3f20245c3215-crio-socket\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.125640 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.125596 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58730f05-2dc5-4837-8ad1-3f20245c3215-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.125892 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.125663 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmj5l\" (UniqueName: 
\"kubernetes.io/projected/58730f05-2dc5-4837-8ad1-3f20245c3215-kube-api-access-dmj5l\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.125892 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.125698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/58730f05-2dc5-4837-8ad1-3f20245c3215-data-volume\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.125892 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.125724 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/58730f05-2dc5-4837-8ad1-3f20245c3215-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.130382 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.130353 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l4ghz"] Apr 21 04:41:12.226733 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.226699 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/58730f05-2dc5-4837-8ad1-3f20245c3215-data-volume\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.226931 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.226740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/58730f05-2dc5-4837-8ad1-3f20245c3215-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.227008 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.226951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/58730f05-2dc5-4837-8ad1-3f20245c3215-crio-socket\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.227008 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.226998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58730f05-2dc5-4837-8ad1-3f20245c3215-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.227155 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.227046 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmj5l\" (UniqueName: \"kubernetes.io/projected/58730f05-2dc5-4837-8ad1-3f20245c3215-kube-api-access-dmj5l\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.227155 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.227084 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/58730f05-2dc5-4837-8ad1-3f20245c3215-crio-socket\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz" Apr 21 04:41:12.227270 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:41:12.227177 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/58730f05-2dc5-4837-8ad1-3f20245c3215-data-volume\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz"
Apr 21 04:41:12.227436 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.227416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/58730f05-2dc5-4837-8ad1-3f20245c3215-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz"
Apr 21 04:41:12.229784 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.229741 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/58730f05-2dc5-4837-8ad1-3f20245c3215-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz"
Apr 21 04:41:12.255863 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.255807 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmj5l\" (UniqueName: \"kubernetes.io/projected/58730f05-2dc5-4837-8ad1-3f20245c3215-kube-api-access-dmj5l\") pod \"insights-runtime-extractor-l4ghz\" (UID: \"58730f05-2dc5-4837-8ad1-3f20245c3215\") " pod="openshift-insights/insights-runtime-extractor-l4ghz"
Apr 21 04:41:12.426828 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.426796 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l4ghz"
Apr 21 04:41:12.884102 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.884054 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l4ghz"]
Apr 21 04:41:12.994806 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.994768 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" event={"ID":"3a707581-ce9a-46b8-9335-8f18bd8dc98c","Type":"ContainerStarted","Data":"104f328d827395fa95ec8cef8a3c5c698394cfeac234d80be2fdb4174bb30254"}
Apr 21 04:41:12.994806 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.994812 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" event={"ID":"3a707581-ce9a-46b8-9335-8f18bd8dc98c","Type":"ContainerStarted","Data":"c9ab328931d99423a9b52d88cb4d822ef4be036d6795fcf4417624581142f796"}
Apr 21 04:41:12.996131 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.996103 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" event={"ID":"8cf0095a-7001-4da1-893d-f6430e613fe9","Type":"ContainerStarted","Data":"fe71c41317389bfe1f40b3d4af6a9d8ae98edbf998848b160b81089928242bd0"}
Apr 21 04:41:12.997492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.997466 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l4ghz" event={"ID":"58730f05-2dc5-4837-8ad1-3f20245c3215","Type":"ContainerStarted","Data":"bee59ce09a9485ecfdf56fbd10b3b3a8081808e4a8e06b54b5fa99e0e668e3fe"}
Apr 21 04:41:12.997492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:12.997497 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l4ghz" event={"ID":"58730f05-2dc5-4837-8ad1-3f20245c3215","Type":"ContainerStarted","Data":"6a05a0e3528f9a8b57d2e608c13b4eb543f7b33d3f8448260a26be75225a477c"}
Apr 21 04:41:13.011592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.011505 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bhmvl" podStartSLOduration=32.843261103 podStartE2EDuration="35.011488376s" podCreationTimestamp="2026-04-21 04:40:38 +0000 UTC" firstStartedPulling="2026-04-21 04:41:10.588869869 +0000 UTC m=+125.710427658" lastFinishedPulling="2026-04-21 04:41:12.757097132 +0000 UTC m=+127.878654931" observedRunningTime="2026-04-21 04:41:13.009667435 +0000 UTC m=+128.131225254" watchObservedRunningTime="2026-04-21 04:41:13.011488376 +0000 UTC m=+128.133046184"
Apr 21 04:41:13.027064 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.027017 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g25vj" podStartSLOduration=32.809194134 podStartE2EDuration="35.027002186s" podCreationTimestamp="2026-04-21 04:40:38 +0000 UTC" firstStartedPulling="2026-04-21 04:41:10.543559219 +0000 UTC m=+125.665117008" lastFinishedPulling="2026-04-21 04:41:12.761367262 +0000 UTC m=+127.882925060" observedRunningTime="2026-04-21 04:41:13.026275688 +0000 UTC m=+128.147833508" watchObservedRunningTime="2026-04-21 04:41:13.027002186 +0000 UTC m=+128.148559995"
Apr 21 04:41:13.264430 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.264338 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"]
Apr 21 04:41:13.267604 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.267581 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:13.270195 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.270170 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 21 04:41:13.270488 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.270251 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-r7czw\""
Apr 21 04:41:13.275024 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.274753 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"]
Apr 21 04:41:13.337400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.337362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/609ee659-7c06-49c7-a370-5afdac306be6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-94zhn\" (UID: \"609ee659-7c06-49c7-a370-5afdac306be6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:13.437951 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.437910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/609ee659-7c06-49c7-a370-5afdac306be6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-94zhn\" (UID: \"609ee659-7c06-49c7-a370-5afdac306be6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:13.438145 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:13.438118 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 04:41:13.438237 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:13.438223 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609ee659-7c06-49c7-a370-5afdac306be6-tls-certificates podName:609ee659-7c06-49c7-a370-5afdac306be6 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:13.938199775 +0000 UTC m=+129.059757567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/609ee659-7c06-49c7-a370-5afdac306be6-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-94zhn" (UID: "609ee659-7c06-49c7-a370-5afdac306be6") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 21 04:41:13.941296 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.941255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/609ee659-7c06-49c7-a370-5afdac306be6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-94zhn\" (UID: \"609ee659-7c06-49c7-a370-5afdac306be6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:13.943984 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:13.943955 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/609ee659-7c06-49c7-a370-5afdac306be6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-94zhn\" (UID: \"609ee659-7c06-49c7-a370-5afdac306be6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:14.003269 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.003224 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l4ghz" event={"ID":"58730f05-2dc5-4837-8ad1-3f20245c3215","Type":"ContainerStarted","Data":"3043f87b17f1ef98eca89258baeeb818f9ff68225ae6accffbaf5bb2dbe90f41"}
Apr 21 04:41:14.180125 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.180085 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:14.245689 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.245657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:41:14.248438 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.248362 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8743d92c-6080-4066-ad83-55bb582a3f6c-metrics-certs\") pod \"network-metrics-daemon-z4rqh\" (UID: \"8743d92c-6080-4066-ad83-55bb582a3f6c\") " pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:41:14.311261 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.311227 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"]
Apr 21 04:41:14.314393 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:14.314363 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod609ee659_7c06_49c7_a370_5afdac306be6.slice/crio-1d4e56950cc42e990d2ec9c3adf842011e63ca4c83fe3a4c8d6ed26f9600484a WatchSource:0}: Error finding container 1d4e56950cc42e990d2ec9c3adf842011e63ca4c83fe3a4c8d6ed26f9600484a: Status 404 returned error can't find the container with id 1d4e56950cc42e990d2ec9c3adf842011e63ca4c83fe3a4c8d6ed26f9600484a
Apr 21 04:41:14.494732 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.494656 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-979g8\""
Apr 21 04:41:14.502789 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.502762 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z4rqh"
Apr 21 04:41:14.998652 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:14.998622 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z4rqh"]
Apr 21 04:41:15.001263 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:15.001239 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8743d92c_6080_4066_ad83_55bb582a3f6c.slice/crio-4906007bf77d0569eb11befb6babe2c157e2919d96b8dc35cdf0d2f51c71db98 WatchSource:0}: Error finding container 4906007bf77d0569eb11befb6babe2c157e2919d96b8dc35cdf0d2f51c71db98: Status 404 returned error can't find the container with id 4906007bf77d0569eb11befb6babe2c157e2919d96b8dc35cdf0d2f51c71db98
Apr 21 04:41:15.011029 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:15.011004 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z4rqh" event={"ID":"8743d92c-6080-4066-ad83-55bb582a3f6c","Type":"ContainerStarted","Data":"4906007bf77d0569eb11befb6babe2c157e2919d96b8dc35cdf0d2f51c71db98"}
Apr 21 04:41:15.012011 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:15.011989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn" event={"ID":"609ee659-7c06-49c7-a370-5afdac306be6","Type":"ContainerStarted","Data":"1d4e56950cc42e990d2ec9c3adf842011e63ca4c83fe3a4c8d6ed26f9600484a"}
Apr 21 04:41:16.016084 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:16.015970 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn" event={"ID":"609ee659-7c06-49c7-a370-5afdac306be6","Type":"ContainerStarted","Data":"fbd470baa25d189e364bb328d444712bc05e2c6df7d816902b5329f1d0024bf8"}
Apr 21 04:41:16.016529 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:16.016178 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:16.018227 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:16.018195 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l4ghz" event={"ID":"58730f05-2dc5-4837-8ad1-3f20245c3215","Type":"ContainerStarted","Data":"a42f06b107eb8de54ea6c3a740f82447208e734f698be8a7eb4a7fe26ac1ed46"}
Apr 21 04:41:16.023177 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:16.023153 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn"
Apr 21 04:41:16.031296 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:16.031223 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-94zhn" podStartSLOduration=1.6234421829999999 podStartE2EDuration="3.031210921s" podCreationTimestamp="2026-04-21 04:41:13 +0000 UTC" firstStartedPulling="2026-04-21 04:41:14.316769053 +0000 UTC m=+129.438326852" lastFinishedPulling="2026-04-21 04:41:15.724537794 +0000 UTC m=+130.846095590" observedRunningTime="2026-04-21 04:41:16.030323591 +0000 UTC m=+131.151881410" watchObservedRunningTime="2026-04-21 04:41:16.031210921 +0000 UTC m=+131.152768727"
Apr 21 04:41:16.048410 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:16.048366 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l4ghz" podStartSLOduration=2.071821926 podStartE2EDuration="4.048353661s" podCreationTimestamp="2026-04-21 04:41:12 +0000 UTC" firstStartedPulling="2026-04-21 04:41:12.945382417 +0000 UTC m=+128.066940210" lastFinishedPulling="2026-04-21 04:41:14.921914146 +0000 UTC m=+130.043471945" observedRunningTime="2026-04-21 04:41:16.046971887 +0000 UTC m=+131.168529703" watchObservedRunningTime="2026-04-21 04:41:16.048353661 +0000 UTC m=+131.169911468"
Apr 21 04:41:17.021990 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:17.021952 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z4rqh" event={"ID":"8743d92c-6080-4066-ad83-55bb582a3f6c","Type":"ContainerStarted","Data":"2640eb3e783e7eb08a8f3a86eddb89b49aef68772511f210fcdbf10a7b18195e"}
Apr 21 04:41:17.021990 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:17.021988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z4rqh" event={"ID":"8743d92c-6080-4066-ad83-55bb582a3f6c","Type":"ContainerStarted","Data":"88486360ebc7da4338a56cbd0e8c2165bb41d73802d31e891a22bd02ca39f08c"}
Apr 21 04:41:17.042509 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:17.042436 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z4rqh" podStartSLOduration=130.854959947 podStartE2EDuration="2m12.042421498s" podCreationTimestamp="2026-04-21 04:39:05 +0000 UTC" firstStartedPulling="2026-04-21 04:41:15.003178963 +0000 UTC m=+130.124736748" lastFinishedPulling="2026-04-21 04:41:16.19064051 +0000 UTC m=+131.312198299" observedRunningTime="2026-04-21 04:41:17.041617583 +0000 UTC m=+132.163175390" watchObservedRunningTime="2026-04-21 04:41:17.042421498 +0000 UTC m=+132.163979510"
Apr 21 04:41:20.665589 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.665553 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"]
Apr 21 04:41:20.671350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.671327 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.674016 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.673991 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 04:41:20.674157 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.673991 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 21 04:41:20.674157 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.673999 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 21 04:41:20.675180 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.675159 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-gc8md\""
Apr 21 04:41:20.676513 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.676491 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"]
Apr 21 04:41:20.690876 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.690856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tclxg\" (UniqueName: \"kubernetes.io/projected/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-kube-api-access-tclxg\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.690976 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.690886 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.691020 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.690974 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.691020 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.691004 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.693811 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.693793 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n5d8t"]
Apr 21 04:41:20.697485 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.697471 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.699914 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.699871 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p4hrl\""
Apr 21 04:41:20.699999 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.699876 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 04:41:20.699999 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.699900 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 04:41:20.700219 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.700206 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 04:41:20.791634 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791603 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-root\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.791789 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791637 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-textfile\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.791789 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791710 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-tls\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.791789 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791746 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tclxg\" (UniqueName: \"kubernetes.io/projected/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-kube-api-access-tclxg\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.791789 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.791972 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.791972 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-wtmp\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.791972 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791904 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c9b1cf-c943-4008-bc8d-4783051875d2-metrics-client-ca\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.791972 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.791944 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47dzm\" (UniqueName: \"kubernetes.io/projected/f8c9b1cf-c943-4008-bc8d-4783051875d2-kube-api-access-47dzm\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.792205 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.792011 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.792205 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.792053 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.792205 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.792109 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-accelerators-collector-config\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.792205 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:20.792186 2568 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 21 04:41:20.792403 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.792227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-sys\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.792403 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:20.792243 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-tls podName:0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75 nodeName:}" failed. No retries permitted until 2026-04-21 04:41:21.292228424 +0000 UTC m=+136.413786209 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-rjx6k" (UID: "0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75") : secret "openshift-state-metrics-tls" not found
Apr 21 04:41:20.792638 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.792619 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.794346 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.794328 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.800782 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.800763 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tclxg\" (UniqueName: \"kubernetes.io/projected/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-kube-api-access-tclxg\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:20.893146 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893106 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-wtmp\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893177 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c9b1cf-c943-4008-bc8d-4783051875d2-metrics-client-ca\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47dzm\" (UniqueName: \"kubernetes.io/projected/f8c9b1cf-c943-4008-bc8d-4783051875d2-kube-api-access-47dzm\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893323 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-wtmp\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-accelerators-collector-config\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893383 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-sys\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893415 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-root\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-textfile\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-tls\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893493 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-sys\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893517 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8c9b1cf-c943-4008-bc8d-4783051875d2-root\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-textfile\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893760 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c9b1cf-c943-4008-bc8d-4783051875d2-metrics-client-ca\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.893913 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.893889 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-accelerators-collector-config\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.895606 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.895585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.895700 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.895684 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8c9b1cf-c943-4008-bc8d-4783051875d2-node-exporter-tls\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:20.900600 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:20.900577 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47dzm\" (UniqueName: \"kubernetes.io/projected/f8c9b1cf-c943-4008-bc8d-4783051875d2-kube-api-access-47dzm\") pod \"node-exporter-n5d8t\" (UID: \"f8c9b1cf-c943-4008-bc8d-4783051875d2\") " pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:21.005807 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.005725 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n5d8t"
Apr 21 04:41:21.013454 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:21.013424 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c9b1cf_c943_4008_bc8d_4783051875d2.slice/crio-0c6f580a6377433d56991add8e09d95c4ede69e0d988f7ecf1a342093dd8484c WatchSource:0}: Error finding container 0c6f580a6377433d56991add8e09d95c4ede69e0d988f7ecf1a342093dd8484c: Status 404 returned error can't find the container with id 0c6f580a6377433d56991add8e09d95c4ede69e0d988f7ecf1a342093dd8484c
Apr 21 04:41:21.032501 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.032471 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n5d8t" event={"ID":"f8c9b1cf-c943-4008-bc8d-4783051875d2","Type":"ContainerStarted","Data":"0c6f580a6377433d56991add8e09d95c4ede69e0d988f7ecf1a342093dd8484c"}
Apr 21 04:41:21.297879 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.297784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:21.300114 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.300088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-rjx6k\" (UID: \"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"
Apr 21 04:41:21.581563 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.581476 2568 util.go:30] "No sandbox
for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k" Apr 21 04:41:21.717566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.717537 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k"] Apr 21 04:41:21.779280 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.779251 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:41:21.784579 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.784551 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.790453 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.790430 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 04:41:21.791402 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791374 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 04:41:21.791530 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791408 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5fnvm\"" Apr 21 04:41:21.791530 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791428 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 04:41:21.791530 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791377 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 04:41:21.791530 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791384 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 04:41:21.791813 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791733 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 04:41:21.791813 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791785 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 04:41:21.791943 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.791739 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 04:41:21.792236 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.792094 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 04:41:21.809234 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.809181 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:41:21.809485 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:21.809462 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d8be9f5_8edd_4d4d_b9f0_3dffedcf1c75.slice/crio-3af7c52d831f84699de48171126739569c9258fbae2f793671d1cfa8b43a789a WatchSource:0}: Error finding container 3af7c52d831f84699de48171126739569c9258fbae2f793671d1cfa8b43a789a: Status 404 returned error can't find the container with id 3af7c52d831f84699de48171126739569c9258fbae2f793671d1cfa8b43a789a Apr 21 04:41:21.902991 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.902960 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903143 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903143 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903097 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-config-volume\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903143 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903310 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903190 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-web-config\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903310 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:41:21.903276 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903414 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903414 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903348 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903414 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903363 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-config-out\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903394 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-web\") 
pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903512 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:21.903566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:21.903537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhtx8\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-kube-api-access-hhtx8\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004354 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004316 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004529 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004368 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004529 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-config-volume\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004529 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004529 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004457 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-web-config\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004745 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004745 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:41:22.004565 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004745 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004895 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-config-out\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004895 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004895 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004841 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.004895 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.005091 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.004895 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhtx8\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-kube-api-access-hhtx8\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.005515 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.005392 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.005983 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.005706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.006523 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.006252 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.008211 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.008121 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.008895 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.008869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-config-volume\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.009802 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.009717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-web-config\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.009802 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.009742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.009935 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.009822 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.009935 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.009863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.010359 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.010340 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-config-out\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.010690 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.010671 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.010760 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.010710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.013325 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.013298 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhtx8\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-kube-api-access-hhtx8\") pod \"alertmanager-main-0\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.036932 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.036899 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k" event={"ID":"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75","Type":"ContainerStarted","Data":"189e0e50611cc21ed70b844aaeb6f2725ebbfba6d8fc80bc536a9549424fe92a"} Apr 21 04:41:22.037060 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.036942 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k" event={"ID":"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75","Type":"ContainerStarted","Data":"690ce2516c76fc1c64c55eb0c2bb59821b44947ad34a8c3e7273639b7b5ad482"} Apr 21 04:41:22.037060 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.036955 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k" event={"ID":"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75","Type":"ContainerStarted","Data":"3af7c52d831f84699de48171126739569c9258fbae2f793671d1cfa8b43a789a"} Apr 21 04:41:22.038364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.038336 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n5d8t" event={"ID":"f8c9b1cf-c943-4008-bc8d-4783051875d2","Type":"ContainerStarted","Data":"d91d8eec7eeba705aa3fbd44e264498de7d4c5b9a5c064e2e6a0067fb44f7a75"} Apr 21 04:41:22.108263 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.108194 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:41:22.231201 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:22.231144 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:41:22.233234 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:22.233201 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bcc49d8_b091_41af_a914_07a7a3388c25.slice/crio-3ed4dbe2c29b8f040444f289f63b3d46226c1900747e2ebb8af5b963418f9090 WatchSource:0}: Error finding container 3ed4dbe2c29b8f040444f289f63b3d46226c1900747e2ebb8af5b963418f9090: Status 404 returned error can't find the container with id 3ed4dbe2c29b8f040444f289f63b3d46226c1900747e2ebb8af5b963418f9090 Apr 21 04:41:23.043351 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.043316 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8c9b1cf-c943-4008-bc8d-4783051875d2" containerID="d91d8eec7eeba705aa3fbd44e264498de7d4c5b9a5c064e2e6a0067fb44f7a75" exitCode=0 Apr 21 04:41:23.043822 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.043369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n5d8t" event={"ID":"f8c9b1cf-c943-4008-bc8d-4783051875d2","Type":"ContainerDied","Data":"d91d8eec7eeba705aa3fbd44e264498de7d4c5b9a5c064e2e6a0067fb44f7a75"} Apr 21 04:41:23.044610 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.044581 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerStarted","Data":"3ed4dbe2c29b8f040444f289f63b3d46226c1900747e2ebb8af5b963418f9090"} Apr 21 04:41:23.481211 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.481124 2568 scope.go:117] "RemoveContainer" containerID="f1a8e3b3a338ea745b6cff411b7d9561d89c571841a5c43ff488b3739a36d571" Apr 21 04:41:23.633207 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:41:23.633172 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-b9c54d565-czdz9"] Apr 21 04:41:23.636984 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.636964 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.639426 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.639406 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pmswh\"" Apr 21 04:41:23.639579 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.639407 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 04:41:23.639579 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.639459 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 04:41:23.639579 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.639460 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-aovcr6bc801p6\"" Apr 21 04:41:23.639770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.639465 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 04:41:23.639770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.639721 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 04:41:23.639770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.639747 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 04:41:23.645500 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:41:23.645481 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b9c54d565-czdz9"] Apr 21 04:41:23.723664 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.723629 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.723835 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.723677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/524c70fa-e764-4096-9e62-1e894feb43cd-metrics-client-ca\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.723835 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.723760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-grpc-tls\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.723835 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.723790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-tls\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.723835 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.723829 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bkvf\" (UniqueName: \"kubernetes.io/projected/524c70fa-e764-4096-9e62-1e894feb43cd-kube-api-access-2bkvf\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.724045 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.723896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.724045 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.723937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.724045 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.724011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824593 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:41:23.824501 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-grpc-tls\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824593 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.824544 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-tls\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824593 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.824569 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bkvf\" (UniqueName: \"kubernetes.io/projected/524c70fa-e764-4096-9e62-1e894feb43cd-kube-api-access-2bkvf\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824593 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.824595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824926 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.824702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824926 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.824776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824926 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.824837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.824926 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.824863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/524c70fa-e764-4096-9e62-1e894feb43cd-metrics-client-ca\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.825707 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.825646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/524c70fa-e764-4096-9e62-1e894feb43cd-metrics-client-ca\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " 
pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.827427 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.827400 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.827692 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.827668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.827820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.827694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-tls\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.827988 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.827968 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.828041 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.828026 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-grpc-tls\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.828153 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.828137 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/524c70fa-e764-4096-9e62-1e894feb43cd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.832408 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.832388 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bkvf\" (UniqueName: \"kubernetes.io/projected/524c70fa-e764-4096-9e62-1e894feb43cd-kube-api-access-2bkvf\") pod \"thanos-querier-b9c54d565-czdz9\" (UID: \"524c70fa-e764-4096-9e62-1e894feb43cd\") " pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:23.946746 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:23.946712 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:24.049903 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.049857 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n5d8t" event={"ID":"f8c9b1cf-c943-4008-bc8d-4783051875d2","Type":"ContainerStarted","Data":"416a84844a69e4af3d4077ea05dd76bedc1c91e1d7f47eba8d46edba1e3bfdd8"} Apr 21 04:41:24.050346 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.049912 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n5d8t" event={"ID":"f8c9b1cf-c943-4008-bc8d-4783051875d2","Type":"ContainerStarted","Data":"d37937beee342c985618a59c8d493244b30d1bc7b7ebeb6fe85c46e3330a5382"} Apr 21 04:41:24.051424 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.051394 2568 generic.go:358] "Generic (PLEG): container finished" podID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerID="f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae" exitCode=0 Apr 21 04:41:24.051548 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.051475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae"} Apr 21 04:41:24.053704 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.053680 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k" event={"ID":"0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75","Type":"ContainerStarted","Data":"63a2c4aa55c8fd4a994853872ff5e899ea2a6e263edc219cd1ac7d74e8d36dfa"} Apr 21 04:41:24.055489 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.055470 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log" Apr 21 
04:41:24.055594 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.055512 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" event={"ID":"8f53d634-201a-47a8-bb8f-a939d320e536","Type":"ContainerStarted","Data":"372390ad6d93e7dff7b17fca88d36538a157a8be3155211871758328486c4787"} Apr 21 04:41:24.055743 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.055728 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" Apr 21 04:41:24.067894 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.067826 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n5d8t" podStartSLOduration=3.227586129 podStartE2EDuration="4.067808363s" podCreationTimestamp="2026-04-21 04:41:20 +0000 UTC" firstStartedPulling="2026-04-21 04:41:21.014993943 +0000 UTC m=+136.136551728" lastFinishedPulling="2026-04-21 04:41:21.855216166 +0000 UTC m=+136.976773962" observedRunningTime="2026-04-21 04:41:24.066441958 +0000 UTC m=+139.187999755" watchObservedRunningTime="2026-04-21 04:41:24.067808363 +0000 UTC m=+139.189366173" Apr 21 04:41:24.075970 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.075948 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b9c54d565-czdz9"] Apr 21 04:41:24.077742 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:24.077714 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524c70fa_e764_4096_9e62_1e894feb43cd.slice/crio-d4031048734767fb96d236f4e92ae5796c86c082f4558b5e01816f5a1d3d10e1 WatchSource:0}: Error finding container d4031048734767fb96d236f4e92ae5796c86c082f4558b5e01816f5a1d3d10e1: Status 404 returned error can't find the container with id d4031048734767fb96d236f4e92ae5796c86c082f4558b5e01816f5a1d3d10e1 Apr 21 04:41:24.111897 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.111850 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rjx6k" podStartSLOduration=2.899285574 podStartE2EDuration="4.111837793s" podCreationTimestamp="2026-04-21 04:41:20 +0000 UTC" firstStartedPulling="2026-04-21 04:41:21.972764681 +0000 UTC m=+137.094322471" lastFinishedPulling="2026-04-21 04:41:23.185316899 +0000 UTC m=+138.306874690" observedRunningTime="2026-04-21 04:41:24.111093118 +0000 UTC m=+139.232650928" watchObservedRunningTime="2026-04-21 04:41:24.111837793 +0000 UTC m=+139.233395602" Apr 21 04:41:24.127825 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.127780 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" podStartSLOduration=43.172099944 podStartE2EDuration="45.127765111s" podCreationTimestamp="2026-04-21 04:40:39 +0000 UTC" firstStartedPulling="2026-04-21 04:40:39.625251107 +0000 UTC m=+94.746808897" lastFinishedPulling="2026-04-21 04:40:41.580916262 +0000 UTC m=+96.702474064" observedRunningTime="2026-04-21 04:41:24.126266161 +0000 UTC m=+139.247823969" watchObservedRunningTime="2026-04-21 04:41:24.127765111 +0000 UTC m=+139.249322921" Apr 21 04:41:24.200165 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:24.200141 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-wdf4b" Apr 21 04:41:25.060821 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:25.060497 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" event={"ID":"524c70fa-e764-4096-9e62-1e894feb43cd","Type":"ContainerStarted","Data":"d4031048734767fb96d236f4e92ae5796c86c082f4558b5e01816f5a1d3d10e1"} Apr 21 04:41:26.069798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.069768 2568 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" event={"ID":"524c70fa-e764-4096-9e62-1e894feb43cd","Type":"ContainerStarted","Data":"235165cbdf2c70946498a37399637aad7daca1b0209ae719a7af3cfc6b09e13e"} Apr 21 04:41:26.070207 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.069808 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" event={"ID":"524c70fa-e764-4096-9e62-1e894feb43cd","Type":"ContainerStarted","Data":"e1400d86e08c604432d67d46fc496e5af5967a153b1a4b19585c2fb62097844a"} Apr 21 04:41:26.072155 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.072128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerStarted","Data":"ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c"} Apr 21 04:41:26.072265 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.072170 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerStarted","Data":"c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65"} Apr 21 04:41:26.890220 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.889725 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:41:26.895945 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.895818 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.900122 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.899864 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 04:41:26.900262 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.900242 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 04:41:26.900338 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.900256 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 04:41:26.900544 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.900521 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.900523 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.900878 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.900907 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.901047 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.901181 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.901219 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.901286 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.901219 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-858rkj8cpvdcq\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.901341 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-5rc8p\"" Apr 21 04:41:26.901434 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.901356 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 04:41:26.903185 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.903163 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 04:41:26.909591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.909565 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:41:26.957143 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957111 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957308 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957152 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957308 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957308 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957262 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957308 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957300 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957504 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957345 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957504 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957371 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957504 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-config-out\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957504 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957441 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957653 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957508 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-config\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957653 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957533 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957653 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957582 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-web-config\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957653 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx858\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-kube-api-access-hx858\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957823 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957823 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957823 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957723 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957823 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:26.957969 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:26.957820 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.058889 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.058848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059105 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.058918 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059105 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.058946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059105 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.058980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059105 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059105 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059171 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-config-out\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-config\") pod \"prometheus-k8s-0\" (UID: 
\"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059264 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.059355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-web-config\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061197 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059357 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx858\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-kube-api-access-hx858\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061197 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059394 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061197 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061197 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061197 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.059474 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061197 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.060743 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061594 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.061433 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.061594 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.061456 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.062061 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.061743 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.064798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.063698 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.064798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.064327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.064798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.064360 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.064798 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:41:27.064524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-config-out\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.065062 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.064839 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.065062 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.064923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.065062 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.064960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.065508 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.065445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
04:41:27.065854 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.065831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-web-config\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.066060 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.066013 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.066155 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.066119 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.066454 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.066430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-config\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.066549 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.066491 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.067790 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:41:27.067762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx858\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-kube-api-access-hx858\") pod \"prometheus-k8s-0\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.083017 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.082916 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" event={"ID":"524c70fa-e764-4096-9e62-1e894feb43cd","Type":"ContainerStarted","Data":"11266e02cbf2a7a54b75f6afaa4c06bb4cb9033d871fd0d8b1814e1451f6a63d"} Apr 21 04:41:27.083017 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.082963 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" event={"ID":"524c70fa-e764-4096-9e62-1e894feb43cd","Type":"ContainerStarted","Data":"019d67729f779b453a0082bdbcff646702861a29b1289b9c981628214536a769"} Apr 21 04:41:27.083017 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.082977 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" event={"ID":"524c70fa-e764-4096-9e62-1e894feb43cd","Type":"ContainerStarted","Data":"a5b20f6bc4f300d08b764e8627e54ee87e56003e95213befca63d4d29389c7b9"} Apr 21 04:41:27.083017 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.082991 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" event={"ID":"524c70fa-e764-4096-9e62-1e894feb43cd","Type":"ContainerStarted","Data":"225eaffe4dab11797d261342d464cdd68779efafee65ef9a212a8dc19883e287"} Apr 21 04:41:27.087729 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.087700 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerStarted","Data":"9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b"} Apr 21 04:41:27.087905 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.087887 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerStarted","Data":"75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d"} Apr 21 04:41:27.088001 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.087986 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerStarted","Data":"ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10"} Apr 21 04:41:27.088264 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.088233 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerStarted","Data":"a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec"} Apr 21 04:41:27.105749 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.105698 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" podStartSLOduration=1.344834184 podStartE2EDuration="4.105683774s" podCreationTimestamp="2026-04-21 04:41:23 +0000 UTC" firstStartedPulling="2026-04-21 04:41:24.079548512 +0000 UTC m=+139.201106298" lastFinishedPulling="2026-04-21 04:41:26.840398089 +0000 UTC m=+141.961955888" observedRunningTime="2026-04-21 04:41:27.103743461 +0000 UTC m=+142.225301267" watchObservedRunningTime="2026-04-21 04:41:27.105683774 +0000 UTC m=+142.227241581" Apr 21 04:41:27.128649 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.128595 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.521692246 podStartE2EDuration="6.128576968s" podCreationTimestamp="2026-04-21 04:41:21 +0000 UTC" firstStartedPulling="2026-04-21 04:41:22.235154322 +0000 UTC m=+137.356712111" lastFinishedPulling="2026-04-21 04:41:26.842039048 +0000 UTC m=+141.963596833" observedRunningTime="2026-04-21 04:41:27.127220871 +0000 UTC m=+142.248778670" watchObservedRunningTime="2026-04-21 04:41:27.128576968 +0000 UTC m=+142.250134775" Apr 21 04:41:27.209794 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.209754 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:27.336514 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:27.336489 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:41:27.338274 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:27.338244 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d0250b8_c362_4063_9b74_0cc043c283e9.slice/crio-f1e01fb82e844d138b8d8e9cf1f321574ea911ecaeb88df040d8ceacccd11a37 WatchSource:0}: Error finding container f1e01fb82e844d138b8d8e9cf1f321574ea911ecaeb88df040d8ceacccd11a37: Status 404 returned error can't find the container with id f1e01fb82e844d138b8d8e9cf1f321574ea911ecaeb88df040d8ceacccd11a37 Apr 21 04:41:28.091732 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:28.091694 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerID="45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d" exitCode=0 Apr 21 04:41:28.092142 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:28.091739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d"} Apr 21 04:41:28.092142 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:28.091774 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerStarted","Data":"f1e01fb82e844d138b8d8e9cf1f321574ea911ecaeb88df040d8ceacccd11a37"} Apr 21 04:41:28.092293 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:28.092280 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:31.104805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:31.104722 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerStarted","Data":"7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d"} Apr 21 04:41:31.104805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:31.104757 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerStarted","Data":"e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec"} Apr 21 04:41:31.104805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:31.104767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerStarted","Data":"7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd"} Apr 21 04:41:31.104805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:31.104775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerStarted","Data":"7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e"} Apr 21 04:41:31.104805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:31.104783 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerStarted","Data":"2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269"} Apr 21 04:41:31.104805 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:31.104791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerStarted","Data":"c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133"} Apr 21 04:41:31.132438 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:31.132383 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.634313766 podStartE2EDuration="5.132370173s" podCreationTimestamp="2026-04-21 04:41:26 +0000 UTC" firstStartedPulling="2026-04-21 04:41:28.093116444 +0000 UTC m=+143.214674232" lastFinishedPulling="2026-04-21 04:41:30.591172849 +0000 UTC m=+145.712730639" observedRunningTime="2026-04-21 04:41:31.131010961 +0000 UTC m=+146.252568768" watchObservedRunningTime="2026-04-21 04:41:31.132370173 +0000 UTC m=+146.253927998" Apr 21 04:41:32.210507 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:32.210448 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:41:34.103527 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:34.103500 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-b9c54d565-czdz9" Apr 21 04:41:39.764175 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:39.764125 2568 pod_workers.go:1301] "Error syncing 
pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qjtkj" podUID="70e12877-2720-42f7-b047-316b48c6b8fe" Apr 21 04:41:39.771273 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:41:39.771239 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-k8p5j" podUID="f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8" Apr 21 04:41:40.131858 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:40.131832 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:41:40.132051 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:40.131832 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qjtkj" Apr 21 04:41:44.725285 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.725249 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:41:44.725645 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.725292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod \"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:41:44.727624 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.727597 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70e12877-2720-42f7-b047-316b48c6b8fe-metrics-tls\") pod 
\"dns-default-qjtkj\" (UID: \"70e12877-2720-42f7-b047-316b48c6b8fe\") " pod="openshift-dns/dns-default-qjtkj" Apr 21 04:41:44.727792 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.727768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8-cert\") pod \"ingress-canary-k8p5j\" (UID: \"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8\") " pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:41:44.934971 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.934938 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ljx4n\"" Apr 21 04:41:44.935970 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.935952 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f9n52\"" Apr 21 04:41:44.942915 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.942892 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qjtkj" Apr 21 04:41:44.943006 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:44.942894 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k8p5j" Apr 21 04:41:45.090795 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:45.090760 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qjtkj"] Apr 21 04:41:45.093345 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:45.093293 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e12877_2720_42f7_b047_316b48c6b8fe.slice/crio-29f8b23efa298ac6f717dcbea9a3179f0ca7c40d49cc872404d49801c2f6f296 WatchSource:0}: Error finding container 29f8b23efa298ac6f717dcbea9a3179f0ca7c40d49cc872404d49801c2f6f296: Status 404 returned error can't find the container with id 29f8b23efa298ac6f717dcbea9a3179f0ca7c40d49cc872404d49801c2f6f296 Apr 21 04:41:45.114940 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:45.114914 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k8p5j"] Apr 21 04:41:45.117326 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:41:45.117296 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e3c014_c3ca_4d5c_ae42_2a22ad9530c8.slice/crio-c0ecbe7e94c5b500dcda82517d4880ea33e141013236704a0f8646c5e4f60967 WatchSource:0}: Error finding container c0ecbe7e94c5b500dcda82517d4880ea33e141013236704a0f8646c5e4f60967: Status 404 returned error can't find the container with id c0ecbe7e94c5b500dcda82517d4880ea33e141013236704a0f8646c5e4f60967 Apr 21 04:41:45.150226 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:45.150184 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjtkj" event={"ID":"70e12877-2720-42f7-b047-316b48c6b8fe","Type":"ContainerStarted","Data":"29f8b23efa298ac6f717dcbea9a3179f0ca7c40d49cc872404d49801c2f6f296"} Apr 21 04:41:45.151242 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:45.151214 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k8p5j" event={"ID":"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8","Type":"ContainerStarted","Data":"c0ecbe7e94c5b500dcda82517d4880ea33e141013236704a0f8646c5e4f60967"} Apr 21 04:41:48.161760 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:48.161714 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjtkj" event={"ID":"70e12877-2720-42f7-b047-316b48c6b8fe","Type":"ContainerStarted","Data":"b56cfc226fcfc88e708e75d9c58c3a03a781563a75c11dad066e7754688b2e42"} Apr 21 04:41:48.161760 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:48.161756 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjtkj" event={"ID":"70e12877-2720-42f7-b047-316b48c6b8fe","Type":"ContainerStarted","Data":"21f0d7d0f131f5b0ffdd33373cab0cbb2e65f90ecdc7cc372d9b0426a931a02a"} Apr 21 04:41:48.162346 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:48.161839 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qjtkj" Apr 21 04:41:48.163055 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:48.163030 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k8p5j" event={"ID":"f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8","Type":"ContainerStarted","Data":"d3f2f363892195a257012956cef2b50fe20576dd81f4f3f89d75e1f1491212dc"} Apr 21 04:41:48.179692 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:48.179651 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qjtkj" podStartSLOduration=130.055130464 podStartE2EDuration="2m12.17963924s" podCreationTimestamp="2026-04-21 04:39:36 +0000 UTC" firstStartedPulling="2026-04-21 04:41:45.095478079 +0000 UTC m=+160.217035865" lastFinishedPulling="2026-04-21 04:41:47.219986841 +0000 UTC m=+162.341544641" observedRunningTime="2026-04-21 04:41:48.177918666 +0000 UTC m=+163.299476472" 
watchObservedRunningTime="2026-04-21 04:41:48.17963924 +0000 UTC m=+163.301197047" Apr 21 04:41:48.193558 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:48.193513 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k8p5j" podStartSLOduration=130.089110525 podStartE2EDuration="2m12.193498427s" podCreationTimestamp="2026-04-21 04:39:36 +0000 UTC" firstStartedPulling="2026-04-21 04:41:45.119169211 +0000 UTC m=+160.240726996" lastFinishedPulling="2026-04-21 04:41:47.223557098 +0000 UTC m=+162.345114898" observedRunningTime="2026-04-21 04:41:48.191945232 +0000 UTC m=+163.313503040" watchObservedRunningTime="2026-04-21 04:41:48.193498427 +0000 UTC m=+163.315056235" Apr 21 04:41:58.168377 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:41:58.168344 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qjtkj" Apr 21 04:42:12.234397 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:12.234360 2568 generic.go:358] "Generic (PLEG): container finished" podID="74465859-3c82-4f58-832b-74cc4fbe41ce" containerID="95c495886fe5c87fc81f96eebb87340ac818777debf8c32d6723b0bee12e6de4" exitCode=0 Apr 21 04:42:12.234884 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:12.234440 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-srd87" event={"ID":"74465859-3c82-4f58-832b-74cc4fbe41ce","Type":"ContainerDied","Data":"95c495886fe5c87fc81f96eebb87340ac818777debf8c32d6723b0bee12e6de4"} Apr 21 04:42:12.234884 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:12.234852 2568 scope.go:117] "RemoveContainer" containerID="95c495886fe5c87fc81f96eebb87340ac818777debf8c32d6723b0bee12e6de4" Apr 21 04:42:13.239228 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:13.239190 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-srd87" 
event={"ID":"74465859-3c82-4f58-832b-74cc4fbe41ce","Type":"ContainerStarted","Data":"7a8c65700a86132f50ab6cc71cf10e50e93ead397604d4bc36a263485e58e628"} Apr 21 04:42:27.210081 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:27.210026 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:27.228880 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:27.228854 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:27.295921 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:27.295893 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:40.986590 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:40.986550 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:42:40.987096 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:40.986995 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="alertmanager" containerID="cri-o://c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65" gracePeriod=120 Apr 21 04:42:40.987182 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:40.987092 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-metric" containerID="cri-o://75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d" gracePeriod=120 Apr 21 04:42:40.987182 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:40.987093 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="prom-label-proxy" 
containerID="cri-o://9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b" gracePeriod=120 Apr 21 04:42:40.987182 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:40.987140 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy" containerID="cri-o://ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10" gracePeriod=120 Apr 21 04:42:40.987331 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:40.987254 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-web" containerID="cri-o://a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec" gracePeriod=120 Apr 21 04:42:40.987387 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:40.987338 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="config-reloader" containerID="cri-o://ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c" gracePeriod=120 Apr 21 04:42:41.322290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322206 2568 generic.go:358] "Generic (PLEG): container finished" podID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerID="9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b" exitCode=0 Apr 21 04:42:41.322290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322231 2568 generic.go:358] "Generic (PLEG): container finished" podID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerID="75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d" exitCode=0 Apr 21 04:42:41.322290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322237 2568 generic.go:358] "Generic (PLEG): container finished" podID="0bcc49d8-b091-41af-a914-07a7a3388c25" 
containerID="ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10" exitCode=0 Apr 21 04:42:41.322290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322243 2568 generic.go:358] "Generic (PLEG): container finished" podID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerID="ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c" exitCode=0 Apr 21 04:42:41.322290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322249 2568 generic.go:358] "Generic (PLEG): container finished" podID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerID="c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65" exitCode=0 Apr 21 04:42:41.322290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b"} Apr 21 04:42:41.322581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322304 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d"} Apr 21 04:42:41.322581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322315 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10"} Apr 21 04:42:41.322581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322325 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c"} Apr 21 04:42:41.322581 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:41.322335 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65"} Apr 21 04:42:42.231212 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.231185 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:42:42.308117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308011 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-main-db\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308051 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-web-config\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308090 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-trusted-ca-bundle\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308112 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-cluster-tls-config\") pod 
\"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308132 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-config-volume\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308148 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308181 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-metrics-client-ca\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308323 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-metric\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308375 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-web\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308403 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-tls-assets\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308431 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:42:42.308447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308444 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-main-tls\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308859 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308481 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-config-out\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308859 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308508 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhtx8\" 
(UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-kube-api-access-hhtx8\") pod \"0bcc49d8-b091-41af-a914-07a7a3388c25\" (UID: \"0bcc49d8-b091-41af-a914-07a7a3388c25\") " Apr 21 04:42:42.308859 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308568 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:42:42.308859 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308775 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-main-db\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:42.308859 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.308797 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:42.309336 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.309284 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:42:42.311607 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.311576 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:42.311727 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.311636 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:42.311793 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.311734 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-kube-api-access-hhtx8" (OuterVolumeSpecName: "kube-api-access-hhtx8") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "kube-api-access-hhtx8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:42:42.311793 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.311753 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-config-out" (OuterVolumeSpecName: "config-out") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:42:42.311793 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.311766 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-config-volume" (OuterVolumeSpecName: "config-volume") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:42.312141 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.312119 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:42.312306 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.312287 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:42.312676 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.312655 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:42:42.316293 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.316264 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:42.324525 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.324486 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-web-config" (OuterVolumeSpecName: "web-config") pod "0bcc49d8-b091-41af-a914-07a7a3388c25" (UID: "0bcc49d8-b091-41af-a914-07a7a3388c25"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:42.328415 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.328382 2568 generic.go:358] "Generic (PLEG): container finished" podID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerID="a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec" exitCode=0 Apr 21 04:42:42.328511 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.328464 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec"} Apr 21 04:42:42.328511 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.328488 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:42:42.328582 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.328513 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0bcc49d8-b091-41af-a914-07a7a3388c25","Type":"ContainerDied","Data":"3ed4dbe2c29b8f040444f289f63b3d46226c1900747e2ebb8af5b963418f9090"} Apr 21 04:42:42.328582 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.328535 2568 scope.go:117] "RemoveContainer" containerID="9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b" Apr 21 04:42:42.337056 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.337038 2568 scope.go:117] "RemoveContainer" containerID="75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d" Apr 21 04:42:42.344936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.344914 2568 scope.go:117] "RemoveContainer" containerID="ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10" Apr 21 04:42:42.352083 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.352053 2568 scope.go:117] "RemoveContainer" containerID="a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec" Apr 21 04:42:42.357314 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.357287 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:42:42.359888 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.359867 2568 scope.go:117] "RemoveContainer" containerID="ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c" Apr 21 04:42:42.365890 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.365862 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:42:42.368601 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.368578 2568 scope.go:117] "RemoveContainer" containerID="c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65" Apr 21 04:42:42.375876 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.375857 2568 scope.go:117] "RemoveContainer" containerID="f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae" Apr 21 04:42:42.382887 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.382868 2568 scope.go:117] "RemoveContainer" containerID="9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b" Apr 21 04:42:42.383275 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:42.383251 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b\": container with ID starting with 9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b not found: ID does not exist" containerID="9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b" Apr 21 04:42:42.383355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.383284 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b"} err="failed to get container status \"9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b\": rpc error: code = NotFound desc = could not find container \"9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b\": container with ID starting with 9d5e894a7fb4b5cf761d9e38d5d5a928c089d9dd99795e9cb868347fb8b2c97b not found: ID does not exist" Apr 21 04:42:42.383355 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.383319 2568 scope.go:117] "RemoveContainer" containerID="75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d" Apr 21 04:42:42.383545 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:42.383531 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d\": container with ID starting with 
75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d not found: ID does not exist" containerID="75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d" Apr 21 04:42:42.383587 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.383554 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d"} err="failed to get container status \"75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d\": rpc error: code = NotFound desc = could not find container \"75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d\": container with ID starting with 75d6078e5b7c76d3cc3e789d9c9bdd5c00296f36fd6dcf8b6d302d26ca1b3c2d not found: ID does not exist" Apr 21 04:42:42.383587 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.383567 2568 scope.go:117] "RemoveContainer" containerID="ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10" Apr 21 04:42:42.383757 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:42.383741 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10\": container with ID starting with ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10 not found: ID does not exist" containerID="ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10" Apr 21 04:42:42.383797 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.383761 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10"} err="failed to get container status \"ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10\": rpc error: code = NotFound desc = could not find container \"ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10\": container with ID starting with 
ec114c1ec10dcd98926212b2a8588bc631577fe6cfeab2a76b9323276f135c10 not found: ID does not exist" Apr 21 04:42:42.383797 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.383775 2568 scope.go:117] "RemoveContainer" containerID="a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec" Apr 21 04:42:42.384157 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:42.384120 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec\": container with ID starting with a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec not found: ID does not exist" containerID="a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec" Apr 21 04:42:42.384310 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.384168 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec"} err="failed to get container status \"a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec\": rpc error: code = NotFound desc = could not find container \"a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec\": container with ID starting with a0935d3be780b40a081a2573bd41459e7e5354289cd9d557dd79cc5fd600f9ec not found: ID does not exist" Apr 21 04:42:42.384310 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.384194 2568 scope.go:117] "RemoveContainer" containerID="ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c" Apr 21 04:42:42.384526 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:42.384504 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c\": container with ID starting with ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c not found: ID does not exist" 
containerID="ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c" Apr 21 04:42:42.384609 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.384531 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c"} err="failed to get container status \"ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c\": rpc error: code = NotFound desc = could not find container \"ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c\": container with ID starting with ce45080f3042fb0356eddd7208320d27bf95cf43629e2cc5a0e524332dc1394c not found: ID does not exist" Apr 21 04:42:42.384609 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.384552 2568 scope.go:117] "RemoveContainer" containerID="c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65" Apr 21 04:42:42.384883 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:42.384856 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65\": container with ID starting with c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65 not found: ID does not exist" containerID="c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65" Apr 21 04:42:42.384936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.384888 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65"} err="failed to get container status \"c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65\": rpc error: code = NotFound desc = could not find container \"c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65\": container with ID starting with c3131849715e94a10cb3c94ee6a79b27d0600e06b248d997721c704805f54a65 not found: ID does not exist" Apr 21 
04:42:42.384936 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.384906 2568 scope.go:117] "RemoveContainer" containerID="f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae" Apr 21 04:42:42.385206 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:42.385182 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae\": container with ID starting with f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae not found: ID does not exist" containerID="f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae" Apr 21 04:42:42.385273 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.385214 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae"} err="failed to get container status \"f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae\": rpc error: code = NotFound desc = could not find container \"f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae\": container with ID starting with f429086c506e6334c3d56b98c78fff253bb939d075c187c46a60f48e841f92ae not found: ID does not exist" Apr 21 04:42:42.393925 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.393902 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:42:42.394247 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394234 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="alertmanager" Apr 21 04:42:42.394289 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394260 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="alertmanager" Apr 21 04:42:42.394289 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394270 2568 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy" Apr 21 04:42:42.394289 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394276 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy" Apr 21 04:42:42.394289 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394283 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="prom-label-proxy" Apr 21 04:42:42.394289 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394290 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="prom-label-proxy" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394301 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="init-config-reloader" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394308 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="init-config-reloader" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394313 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-metric" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394319 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-metric" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394330 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-web" Apr 21 04:42:42.394435 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394336 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-web" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394347 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="config-reloader" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394352 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="config-reloader" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394396 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394403 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-web" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394411 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="config-reloader" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394416 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="prom-label-proxy" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394424 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="kube-rbac-proxy-metric" Apr 21 04:42:42.394435 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.394431 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" containerName="alertmanager" Apr 21 
04:42:42.399618 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.399602 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.402179 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402152 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 04:42:42.402291 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402179 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 04:42:42.402291 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402152 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 04:42:42.402291 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402284 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 04:42:42.402491 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402476 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 04:42:42.402553 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402483 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5fnvm\""
Apr 21 04:42:42.402685 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402668 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 04:42:42.402749 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402734 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 04:42:42.402799 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.402786 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 04:42:42.408603 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.408582 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 04:42:42.409169 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409148 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-config-volume\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409211 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409180 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkgfv\" (UniqueName: \"kubernetes.io/projected/872de7ac-6622-48c0-b75b-c05be54702c3-kube-api-access-jkgfv\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409211 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409198 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409275 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/872de7ac-6622-48c0-b75b-c05be54702c3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409306 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409284 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-web-config\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409309 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409341 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/872de7ac-6622-48c0-b75b-c05be54702c3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409432 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409404 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/872de7ac-6622-48c0-b75b-c05be54702c3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409432 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409495 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409436 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/872de7ac-6622-48c0-b75b-c05be54702c3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409495 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409456 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/872de7ac-6622-48c0-b75b-c05be54702c3-config-out\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409555 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409508 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.409588 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409573 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcc49d8-b091-41af-a914-07a7a3388c25-metrics-client-ca\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409621 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409585 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409621 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409596 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409621 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409606 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-tls-assets\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409621 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409616 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-main-tls\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409625 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0bcc49d8-b091-41af-a914-07a7a3388c25-config-out\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409634 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhtx8\" (UniqueName: \"kubernetes.io/projected/0bcc49d8-b091-41af-a914-07a7a3388c25-kube-api-access-hhtx8\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409643 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-web-config\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409652 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-cluster-tls-config\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409661 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-config-volume\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.409737 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.409670 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0bcc49d8-b091-41af-a914-07a7a3388c25-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:42:42.414299 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.414274 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:42:42.509991 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.509946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/872de7ac-6622-48c0-b75b-c05be54702c3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.509991 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.509997 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-web-config\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510317 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510317 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510317 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/872de7ac-6622-48c0-b75b-c05be54702c3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510317 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/872de7ac-6622-48c0-b75b-c05be54702c3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510317 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/872de7ac-6622-48c0-b75b-c05be54702c3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/872de7ac-6622-48c0-b75b-c05be54702c3-config-out\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510388 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-config-volume\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510502 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkgfv\" (UniqueName: \"kubernetes.io/projected/872de7ac-6622-48c0-b75b-c05be54702c3-kube-api-access-jkgfv\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510581 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.510995 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.510967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/872de7ac-6622-48c0-b75b-c05be54702c3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.511466 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.511108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/872de7ac-6622-48c0-b75b-c05be54702c3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.512244 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.512205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/872de7ac-6622-48c0-b75b-c05be54702c3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.513436 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.513371 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.513436 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.513388 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/872de7ac-6622-48c0-b75b-c05be54702c3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.513590 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.513567 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-web-config\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.513637 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.513604 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-config-volume\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.513691 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.513667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.513729 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.513676 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/872de7ac-6622-48c0-b75b-c05be54702c3-config-out\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.513906 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.513887 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.514219 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.514203 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.514943 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.514928 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/872de7ac-6622-48c0-b75b-c05be54702c3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.521368 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.521350 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkgfv\" (UniqueName: \"kubernetes.io/projected/872de7ac-6622-48c0-b75b-c05be54702c3-kube-api-access-jkgfv\") pod \"alertmanager-main-0\" (UID: \"872de7ac-6622-48c0-b75b-c05be54702c3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.709869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.709831 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:42:42.844186 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:42.844149 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:42:42.847041 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:42:42.847012 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872de7ac_6622_48c0_b75b_c05be54702c3.slice/crio-2ee7553df2eb2c338fe31a9befe298849ea3e71d38494d5bbf8932db80f6373a WatchSource:0}: Error finding container 2ee7553df2eb2c338fe31a9befe298849ea3e71d38494d5bbf8932db80f6373a: Status 404 returned error can't find the container with id 2ee7553df2eb2c338fe31a9befe298849ea3e71d38494d5bbf8932db80f6373a
Apr 21 04:42:43.333407 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:43.333310 2568 generic.go:358] "Generic (PLEG): container finished" podID="872de7ac-6622-48c0-b75b-c05be54702c3" containerID="bbcc7387b65f5e9c967e2faebc11e6b282048906791c318392424cb6812c9461" exitCode=0
Apr 21 04:42:43.333902 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:43.333400 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerDied","Data":"bbcc7387b65f5e9c967e2faebc11e6b282048906791c318392424cb6812c9461"}
Apr 21 04:42:43.333902 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:43.333437 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerStarted","Data":"2ee7553df2eb2c338fe31a9befe298849ea3e71d38494d5bbf8932db80f6373a"}
Apr 21 04:42:43.486550 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:43.486521 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bcc49d8-b091-41af-a914-07a7a3388c25" path="/var/lib/kubelet/pods/0bcc49d8-b091-41af-a914-07a7a3388c25/volumes"
Apr 21 04:42:44.341135 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:44.341090 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerStarted","Data":"68742c05c4c5259b46ea657b5f341a898a3add62d7650e334fce686767d91d89"}
Apr 21 04:42:44.341135 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:44.341135 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerStarted","Data":"d72b5fa9d3c625a3185a1238414bacaa54616122f538337ec3aac8ebdc907516"}
Apr 21 04:42:44.341548 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:44.341147 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerStarted","Data":"bb71eb75b5703be9fddd0b420bd1bdcef731f9a4cbc874038c6a61793d204963"}
Apr 21 04:42:44.341548 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:44.341161 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerStarted","Data":"83d9f26824f2d13e0482db64ba383066f597c809a9395229e8f8edf2548a632a"}
Apr 21 04:42:44.341548 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:44.341174 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerStarted","Data":"04e6b72f76a99e8e950691e7e19fb272adc478f802f661e32b55de721a8bf035"}
Apr 21 04:42:44.341548 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:44.341185 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"872de7ac-6622-48c0-b75b-c05be54702c3","Type":"ContainerStarted","Data":"cf393d9a7e7ee5497be17ee5de260a07c83a345f803d0ca5d821d634929a09f1"}
Apr 21 04:42:44.368329 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:44.368271 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.368252979 podStartE2EDuration="2.368252979s" podCreationTimestamp="2026-04-21 04:42:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:42:44.365918877 +0000 UTC m=+219.487476685" watchObservedRunningTime="2026-04-21 04:42:44.368252979 +0000 UTC m=+219.489810788"
Apr 21 04:42:45.239119 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:45.239064 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 04:42:45.239956 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:45.239917 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="prometheus" containerID="cri-o://c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133" gracePeriod=600
Apr 21 04:42:45.240276 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:45.240224 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy" containerID="cri-o://e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec" gracePeriod=600
Apr 21 04:42:45.240376 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:45.240350 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-web" containerID="cri-o://7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd" gracePeriod=600
Apr 21 04:42:45.240430 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:45.240402 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-thanos" containerID="cri-o://7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d" gracePeriod=600
Apr 21 04:42:45.240495 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:45.240475 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="thanos-sidecar" containerID="cri-o://7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e" gracePeriod=600
Apr 21 04:42:45.240546 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:45.240498 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="config-reloader" containerID="cri-o://2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269" gracePeriod=600
Apr 21 04:42:46.350962 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.350931 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerID="7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d" exitCode=0
Apr 21 04:42:46.350962 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.350956 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerID="e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec" exitCode=0
Apr 21 04:42:46.350962 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.350963 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerID="7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e" exitCode=0
Apr 21 04:42:46.350962 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.350969 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerID="2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269" exitCode=0
Apr 21 04:42:46.350962 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.350974 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerID="c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133" exitCode=0
Apr 21 04:42:46.351448 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.351002 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d"}
Apr 21 04:42:46.351448 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.351041 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec"}
Apr 21 04:42:46.351448 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.351053 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e"}
Apr 21 04:42:46.351448 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.351063 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269"}
Apr 21 04:42:46.351448 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.351095 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133"}
Apr 21 04:42:46.488769 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.488746 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:42:46.543948 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.543914 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx858\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-kube-api-access-hx858\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544145 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.543969 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-trusted-ca-bundle\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544145 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544088 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-tls-assets\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544145 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544138 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-db\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544344 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544185 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-web-config\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544344 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544215 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-thanos-prometheus-http-client-file\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544344 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544242 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-config\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544344 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544279 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544344 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544312 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544344 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544340 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-kubelet-serving-ca-bundle\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544366 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-grpc-tls\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544418 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-tls\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.544613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.544425 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:42:46.545386 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545353 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:42:46.545541 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545491 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-metrics-client-certs\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.545601 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545545 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-kube-rbac-proxy\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") "
Apr 21 04:42:46.545601 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545560 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "prometheus-k8s-db".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:42:46.545601 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545583 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-serving-certs-ca-bundle\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " Apr 21 04:42:46.545741 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545630 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-metrics-client-ca\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " Apr 21 04:42:46.545741 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545658 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-config-out\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " Apr 21 04:42:46.545741 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.545690 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-rulefiles-0\") pod \"1d0250b8-c362-4063-9b74-0cc043c283e9\" (UID: \"1d0250b8-c362-4063-9b74-0cc043c283e9\") " Apr 21 04:42:46.546095 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.546054 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.546189 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.546094 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-db\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.546189 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.546110 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.547342 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.548268 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.548916 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-config" (OuterVolumeSpecName: "config") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.548952 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-kube-api-access-hx858" (OuterVolumeSpecName: "kube-api-access-hx858") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "kube-api-access-hx858". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.549020 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.549104 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.549115 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.549390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.549343 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:42:46.550290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.550242 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.550290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.550258 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.550439 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.550371 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-config-out" (OuterVolumeSpecName: "config-out") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:42:46.550503 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.550474 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.550609 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.550585 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.550785 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.550757 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:42:46.559514 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.559493 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-web-config" (OuterVolumeSpecName: "web-config") pod "1d0250b8-c362-4063-9b74-0cc043c283e9" (UID: "1d0250b8-c362-4063-9b74-0cc043c283e9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:42:46.646626 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646596 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646626 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646624 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-config\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646634 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646644 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646654 2568 reconciler_common.go:299] "Volume detached for 
volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-grpc-tls\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646663 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646672 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-metrics-client-certs\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646681 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-secret-kube-rbac-proxy\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646691 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646701 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-configmap-metrics-client-ca\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646709 2568 reconciler_common.go:299] "Volume detached for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d0250b8-c362-4063-9b74-0cc043c283e9-config-out\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646718 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d0250b8-c362-4063-9b74-0cc043c283e9-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646726 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hx858\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-kube-api-access-hx858\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646734 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d0250b8-c362-4063-9b74-0cc043c283e9-tls-assets\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:46.646820 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:46.646744 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d0250b8-c362-4063-9b74-0cc043c283e9-web-config\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:42:47.357330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.357298 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerID="7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd" exitCode=0 Apr 21 04:42:47.357754 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.357375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd"} Apr 21 04:42:47.357754 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.357405 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.357754 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.357414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d0250b8-c362-4063-9b74-0cc043c283e9","Type":"ContainerDied","Data":"f1e01fb82e844d138b8d8e9cf1f321574ea911ecaeb88df040d8ceacccd11a37"} Apr 21 04:42:47.357754 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.357431 2568 scope.go:117] "RemoveContainer" containerID="7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d" Apr 21 04:42:47.365289 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.365267 2568 scope.go:117] "RemoveContainer" containerID="e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec" Apr 21 04:42:47.372603 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.372583 2568 scope.go:117] "RemoveContainer" containerID="7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd" Apr 21 04:42:47.379825 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.379755 2568 scope.go:117] "RemoveContainer" containerID="7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e" Apr 21 04:42:47.381394 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.381374 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:42:47.385646 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.385621 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:42:47.388559 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.388542 2568 scope.go:117] "RemoveContainer" 
containerID="2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269" Apr 21 04:42:47.395521 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.395504 2568 scope.go:117] "RemoveContainer" containerID="c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133" Apr 21 04:42:47.402969 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.402947 2568 scope.go:117] "RemoveContainer" containerID="45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d" Apr 21 04:42:47.407332 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407309 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:42:47.407649 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407636 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="config-reloader" Apr 21 04:42:47.407693 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407653 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="config-reloader" Apr 21 04:42:47.407693 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407668 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-thanos" Apr 21 04:42:47.407693 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407674 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-thanos" Apr 21 04:42:47.407693 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407686 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="init-config-reloader" Apr 21 04:42:47.407693 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407693 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" 
containerName="init-config-reloader" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407701 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-web" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407706 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-web" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407715 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407719 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407732 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="prometheus" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407740 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="prometheus" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407750 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="thanos-sidecar" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407755 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="thanos-sidecar" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407798 2568 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-web" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407811 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407821 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="thanos-sidecar" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407832 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="config-reloader" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407841 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="prometheus" Apr 21 04:42:47.407861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.407851 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" containerName="kube-rbac-proxy-thanos" Apr 21 04:42:47.411261 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.411234 2568 scope.go:117] "RemoveContainer" containerID="7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d" Apr 21 04:42:47.411612 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:47.411588 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d\": container with ID starting with 7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d not found: ID does not exist" containerID="7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d" Apr 21 04:42:47.411695 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.411622 2568 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d"} err="failed to get container status \"7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d\": rpc error: code = NotFound desc = could not find container \"7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d\": container with ID starting with 7cfc22f2a7ad9e277a3c6a7a35f4c026453e722da93fde0589ff9525509df15d not found: ID does not exist" Apr 21 04:42:47.411695 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.411648 2568 scope.go:117] "RemoveContainer" containerID="e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec" Apr 21 04:42:47.412005 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:47.411977 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec\": container with ID starting with e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec not found: ID does not exist" containerID="e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec" Apr 21 04:42:47.412124 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.412023 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec"} err="failed to get container status \"e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec\": rpc error: code = NotFound desc = could not find container \"e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec\": container with ID starting with e402bd99d2df6d31276399fc0b6101ff1a42245935f8b32667070b6f869ae7ec not found: ID does not exist" Apr 21 04:42:47.412124 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.412047 2568 scope.go:117] "RemoveContainer" containerID="7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd" Apr 21 
04:42:47.412356 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:47.412315 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd\": container with ID starting with 7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd not found: ID does not exist" containerID="7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd" Apr 21 04:42:47.412404 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.412359 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd"} err="failed to get container status \"7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd\": rpc error: code = NotFound desc = could not find container \"7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd\": container with ID starting with 7f0fa038012831170a308380c44f5e8fc6724753b2d9be5d4170afa1e70a7fcd not found: ID does not exist" Apr 21 04:42:47.412404 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.412385 2568 scope.go:117] "RemoveContainer" containerID="7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e" Apr 21 04:42:47.413548 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.413530 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.414678 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:47.414538 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e\": container with ID starting with 7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e not found: ID does not exist" containerID="7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e" Apr 21 04:42:47.414678 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.414572 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e"} err="failed to get container status \"7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e\": rpc error: code = NotFound desc = could not find container \"7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e\": container with ID starting with 7b33bfafeb15882fa1ec5911fc9d79ec6ed55b2c1c6d0ad6aa8bf218b6ff4c6e not found: ID does not exist" Apr 21 04:42:47.414678 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.414633 2568 scope.go:117] "RemoveContainer" containerID="2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269" Apr 21 04:42:47.415035 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:47.414982 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269\": container with ID starting with 2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269 not found: ID does not exist" containerID="2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269" Apr 21 04:42:47.415117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.415042 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269"} err="failed to get container status \"2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269\": rpc error: code = NotFound desc = could not find container \"2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269\": container with ID starting with 2b60f2743c83cf19e36f16dddffd308fa8d3b5f325c68e994085ea86da9ef269 not found: ID does not exist" Apr 21 04:42:47.415117 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.415063 2568 scope.go:117] "RemoveContainer" containerID="c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133" Apr 21 04:42:47.415741 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:42:47.415721 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133\": container with ID starting with c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133 not found: ID does not exist" containerID="c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133" Apr 21 04:42:47.415818 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.415744 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133"} err="failed to get container status \"c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133\": rpc error: code = NotFound desc = could not find container \"c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133\": container with ID starting with c17be8b2b83190e8b3116f7a6f240174a1ba26579f14e6789987e961c0d13133 not found: ID does not exist" Apr 21 04:42:47.415818 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.415764 2568 scope.go:117] "RemoveContainer" containerID="45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d" Apr 21 04:42:47.416023 ip-10-0-141-241 
kubenswrapper[2568]: E0421 04:42:47.416003 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d\": container with ID starting with 45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d not found: ID does not exist" containerID="45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d" Apr 21 04:42:47.416138 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416028 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d"} err="failed to get container status \"45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d\": rpc error: code = NotFound desc = could not find container \"45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d\": container with ID starting with 45480ef1a56f7712ee52f8e980344d56b6ef48ccba3b14fe97646c6564402a3d not found: ID does not exist" Apr 21 04:42:47.416393 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416373 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 04:42:47.416487 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416467 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 04:42:47.416487 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416479 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 04:42:47.416590 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416570 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-5rc8p\"" Apr 21 04:42:47.416590 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:42:47.416579 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 04:42:47.416807 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416793 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 04:42:47.416871 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416830 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 04:42:47.416871 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416845 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 04:42:47.416967 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.416910 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 04:42:47.417425 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.417387 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 04:42:47.417496 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.417389 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 04:42:47.417857 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.417842 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 04:42:47.418048 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.418029 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-858rkj8cpvdcq\"" Apr 21 04:42:47.419718 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.419654 
2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 04:42:47.422225 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.422208 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 04:42:47.425947 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.425927 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:42:47.452903 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.452870 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.452903 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.452904 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453126 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.452931 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453126 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.452956 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453126 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453021 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453126 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453126 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453102 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8nr7\" (UniqueName: \"kubernetes.io/projected/1d1329d2-f78b-491e-aea8-de3ec08202c7-kube-api-access-t8nr7\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453148 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453254 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453350 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453324 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453423 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-config\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453450 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d1329d2-f78b-491e-aea8-de3ec08202c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453478 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d1329d2-f78b-491e-aea8-de3ec08202c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.453613 
ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.453503 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.485015 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.484980 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d0250b8-c362-4063-9b74-0cc043c283e9" path="/var/lib/kubelet/pods/1d0250b8-c362-4063-9b74-0cc043c283e9/volumes" Apr 21 04:42:47.554012 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.553977 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554180 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554180 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554044 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8nr7\" (UniqueName: \"kubernetes.io/projected/1d1329d2-f78b-491e-aea8-de3ec08202c7-kube-api-access-t8nr7\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554180 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:42:47.554105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554180 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554180 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554180 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554294 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-config\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d1329d2-f78b-491e-aea8-de3ec08202c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d1329d2-f78b-491e-aea8-de3ec08202c7-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554365 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554398 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554447 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.554481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.554471 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.555507 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.555185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.555507 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.555303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.555694 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.555641 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.556819 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.556773 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.557824 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.557794 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.557922 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.557804 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.558383 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.558305 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.558383 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.558331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d1329d2-f78b-491e-aea8-de3ec08202c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.558810 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.558780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.559026 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.558983 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.559387 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.559358 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d1329d2-f78b-491e-aea8-de3ec08202c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.559476 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.559433 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1d1329d2-f78b-491e-aea8-de3ec08202c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.559533 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.559520 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-config\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.559679 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.559661 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.559764 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.559745 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.559938 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.559922 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.560357 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.560337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d1329d2-f78b-491e-aea8-de3ec08202c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.562264 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.562245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8nr7\" (UniqueName: \"kubernetes.io/projected/1d1329d2-f78b-491e-aea8-de3ec08202c7-kube-api-access-t8nr7\") pod \"prometheus-k8s-0\" (UID: \"1d1329d2-f78b-491e-aea8-de3ec08202c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:42:47.724642 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.724601 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:42:47.851016 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:47.850978 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 04:42:47.854816 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:42:47.854789 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d1329d2_f78b_491e_aea8_de3ec08202c7.slice/crio-412a9094cc1788d1abd57a2947ecb6cc86272ebcbf262d0b8146cd3e5dccdbdd WatchSource:0}: Error finding container 412a9094cc1788d1abd57a2947ecb6cc86272ebcbf262d0b8146cd3e5dccdbdd: Status 404 returned error can't find the container with id 412a9094cc1788d1abd57a2947ecb6cc86272ebcbf262d0b8146cd3e5dccdbdd
Apr 21 04:42:48.362492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:48.362456 2568 generic.go:358] "Generic (PLEG): container finished" podID="1d1329d2-f78b-491e-aea8-de3ec08202c7" containerID="5d330354f7c37bab1928786e458d678e22253841fc642d49b9f80c5e03a1df4d" exitCode=0
Apr 21 04:42:48.362947 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:48.362538 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerDied","Data":"5d330354f7c37bab1928786e458d678e22253841fc642d49b9f80c5e03a1df4d"}
Apr 21 04:42:48.362947 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:48.362579 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerStarted","Data":"412a9094cc1788d1abd57a2947ecb6cc86272ebcbf262d0b8146cd3e5dccdbdd"}
Apr 21 04:42:49.370142 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:49.370101 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerStarted","Data":"fac012ab99565141a8aa140eb3ec1a410058379ec37a9637f2234a9016245f24"}
Apr 21 04:42:49.370142 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:49.370144 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerStarted","Data":"bbfd4c49cf405508748798e91fa40b7658c2a50000cac51c9749cebb2ec20b16"}
Apr 21 04:42:49.370580 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:49.370153 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerStarted","Data":"b8202c57cdc650d4175eb41f6c9aa50a254f10b7e2945b0095e33c12d9c88bfd"}
Apr 21 04:42:49.370580 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:49.370163 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerStarted","Data":"c207390ed6e680062587023a56854db6cca385aadf0a196810fcef5f39e55a93"}
Apr 21 04:42:49.370580 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:49.370171 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerStarted","Data":"59b56226fc9b63380baea370117ca7a811e0fcf2f7c2f70978695b7fd6b1eb9d"}
Apr 21 04:42:49.370580 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:49.370179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1d1329d2-f78b-491e-aea8-de3ec08202c7","Type":"ContainerStarted","Data":"a0bf5b139938352dbb98dcd5ac3344d7321a0b4eb6fa2c654ce2052c7b98e3df"}
Apr 21 04:42:49.398216 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:49.397718 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.39769788 podStartE2EDuration="2.39769788s" podCreationTimestamp="2026-04-21 04:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:42:49.395927489 +0000 UTC m=+224.517485333" watchObservedRunningTime="2026-04-21 04:42:49.39769788 +0000 UTC m=+224.519255680"
Apr 21 04:42:52.725178 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:42:52.725143 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:43:37.905314 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:37.905230 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4s8qh"]
Apr 21 04:43:37.908492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:37.908468 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:37.915469 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:37.915444 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 04:43:37.929425 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:37.929400 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4s8qh"]
Apr 21 04:43:38.000183 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.000135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/973fae08-b8f2-43d5-9852-9711b218e560-dbus\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.000389 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.000259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/973fae08-b8f2-43d5-9852-9711b218e560-kubelet-config\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.000389 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.000299 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/973fae08-b8f2-43d5-9852-9711b218e560-original-pull-secret\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.100937 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.100896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/973fae08-b8f2-43d5-9852-9711b218e560-kubelet-config\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.101148 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.100947 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/973fae08-b8f2-43d5-9852-9711b218e560-original-pull-secret\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.101148 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.101012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/973fae08-b8f2-43d5-9852-9711b218e560-dbus\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.101148 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.101021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/973fae08-b8f2-43d5-9852-9711b218e560-kubelet-config\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.101345 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.101200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/973fae08-b8f2-43d5-9852-9711b218e560-dbus\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.103267 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.103245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/973fae08-b8f2-43d5-9852-9711b218e560-original-pull-secret\") pod \"global-pull-secret-syncer-4s8qh\" (UID: \"973fae08-b8f2-43d5-9852-9711b218e560\") " pod="kube-system/global-pull-secret-syncer-4s8qh"
Apr 21 04:43:38.217504 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.217417 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4s8qh" Apr 21 04:43:38.348697 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.348661 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4s8qh"] Apr 21 04:43:38.351902 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:43:38.351874 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973fae08_b8f2_43d5_9852_9711b218e560.slice/crio-32c7d67849893a4ca2bc67e46bb7b46a82dd04a0ab298b112f17dd25f97f4ea8 WatchSource:0}: Error finding container 32c7d67849893a4ca2bc67e46bb7b46a82dd04a0ab298b112f17dd25f97f4ea8: Status 404 returned error can't find the container with id 32c7d67849893a4ca2bc67e46bb7b46a82dd04a0ab298b112f17dd25f97f4ea8 Apr 21 04:43:38.521059 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:38.520976 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4s8qh" event={"ID":"973fae08-b8f2-43d5-9852-9711b218e560","Type":"ContainerStarted","Data":"32c7d67849893a4ca2bc67e46bb7b46a82dd04a0ab298b112f17dd25f97f4ea8"} Apr 21 04:43:42.536019 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:42.535984 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4s8qh" event={"ID":"973fae08-b8f2-43d5-9852-9711b218e560","Type":"ContainerStarted","Data":"ebbdece59e51eefdb1c4070ec7b83f3d4a8a81f785843fb16886ba5779bd65a5"} Apr 21 04:43:42.562704 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:42.562658 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4s8qh" podStartSLOduration=1.995095676 podStartE2EDuration="5.562644093s" podCreationTimestamp="2026-04-21 04:43:37 +0000 UTC" firstStartedPulling="2026-04-21 04:43:38.353439835 +0000 UTC m=+273.474997619" lastFinishedPulling="2026-04-21 04:43:41.920988237 +0000 UTC m=+277.042546036" 
observedRunningTime="2026-04-21 04:43:42.562295372 +0000 UTC m=+277.683853179" watchObservedRunningTime="2026-04-21 04:43:42.562644093 +0000 UTC m=+277.684201899" Apr 21 04:43:47.724917 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:47.724868 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:43:47.740370 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:47.740345 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:43:48.569990 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:43:48.569962 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:44:05.376254 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:05.376228 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log" Apr 21 04:44:05.376699 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:05.376228 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log" Apr 21 04:44:05.383418 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:05.383397 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log" Apr 21 04:44:05.383566 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:05.383457 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log" Apr 21 04:44:05.386116 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:05.386097 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 04:44:51.100418 ip-10-0-141-241 kubenswrapper[2568]: 
I0421 04:44:51.100382 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j"] Apr 21 04:44:51.102682 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.102665 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.105844 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.105818 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 04:44:51.105977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.105883 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 04:44:51.106207 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.106187 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9fpwh\"" Apr 21 04:44:51.106391 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.106370 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 04:44:51.106671 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.106651 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 04:44:51.110829 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.110809 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvgkk\" (UniqueName: \"kubernetes.io/projected/18136690-4103-4545-8843-72079fd2605c-kube-api-access-gvgkk\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 
04:44:51.110923 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.110857 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/18136690-4103-4545-8843-72079fd2605c-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.110977 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.110948 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/18136690-4103-4545-8843-72079fd2605c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.123752 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.123726 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j"] Apr 21 04:44:51.212046 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.212009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/18136690-4103-4545-8843-72079fd2605c-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.212261 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.212097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/18136690-4103-4545-8843-72079fd2605c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: 
\"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.212261 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.212137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvgkk\" (UniqueName: \"kubernetes.io/projected/18136690-4103-4545-8843-72079fd2605c-kube-api-access-gvgkk\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.214488 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.214459 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/18136690-4103-4545-8843-72079fd2605c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.214596 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.214570 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/18136690-4103-4545-8843-72079fd2605c-webhook-cert\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.221660 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.221638 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvgkk\" (UniqueName: \"kubernetes.io/projected/18136690-4103-4545-8843-72079fd2605c-kube-api-access-gvgkk\") pod \"opendatahub-operator-controller-manager-55ddb68486-pbs7j\" (UID: \"18136690-4103-4545-8843-72079fd2605c\") " pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 
04:44:51.412980 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.412945 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:51.542212 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.542140 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j"] Apr 21 04:44:51.545093 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:44:51.545041 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18136690_4103_4545_8843_72079fd2605c.slice/crio-985ae8d0c0a7ac9af24ea0c1e333137580fec350dfb887021110019442e39652 WatchSource:0}: Error finding container 985ae8d0c0a7ac9af24ea0c1e333137580fec350dfb887021110019442e39652: Status 404 returned error can't find the container with id 985ae8d0c0a7ac9af24ea0c1e333137580fec350dfb887021110019442e39652 Apr 21 04:44:51.546662 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.546645 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:44:51.747488 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:51.747391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" event={"ID":"18136690-4103-4545-8843-72079fd2605c","Type":"ContainerStarted","Data":"985ae8d0c0a7ac9af24ea0c1e333137580fec350dfb887021110019442e39652"} Apr 21 04:44:54.758650 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:54.758612 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" event={"ID":"18136690-4103-4545-8843-72079fd2605c","Type":"ContainerStarted","Data":"0875191b59cf0118b9480868b3e43604a7361604fe61d6879336c307575208dc"} Apr 21 04:44:54.759051 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:54.758712 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:44:54.783481 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:44:54.783435 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" podStartSLOduration=1.223371936 podStartE2EDuration="3.783422121s" podCreationTimestamp="2026-04-21 04:44:51 +0000 UTC" firstStartedPulling="2026-04-21 04:44:51.546764325 +0000 UTC m=+346.668322110" lastFinishedPulling="2026-04-21 04:44:54.106814507 +0000 UTC m=+349.228372295" observedRunningTime="2026-04-21 04:44:54.783257318 +0000 UTC m=+349.904815127" watchObservedRunningTime="2026-04-21 04:44:54.783422121 +0000 UTC m=+349.904979928" Apr 21 04:45:00.466284 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.466245 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx"] Apr 21 04:45:00.468439 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.468423 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.473004 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.472979 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 04:45:00.473154 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.473047 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 04:45:00.473154 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.473093 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qnzdb\"" Apr 21 04:45:00.473154 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.473121 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 04:45:00.473391 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.473374 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 04:45:00.473438 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.473401 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:45:00.484679 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.484659 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx"] Apr 21 04:45:00.493411 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.493389 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95sg5\" (UniqueName: \"kubernetes.io/projected/868e0256-9c68-4772-8ae4-45e88a055901-kube-api-access-95sg5\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: 
\"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.493521 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.493430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/868e0256-9c68-4772-8ae4-45e88a055901-cert\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.493521 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.493503 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/868e0256-9c68-4772-8ae4-45e88a055901-manager-config\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.493625 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.493537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/868e0256-9c68-4772-8ae4-45e88a055901-metrics-cert\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.594529 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.594496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95sg5\" (UniqueName: \"kubernetes.io/projected/868e0256-9c68-4772-8ae4-45e88a055901-kube-api-access-95sg5\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.594725 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.594546 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/868e0256-9c68-4772-8ae4-45e88a055901-cert\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.594725 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.594599 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/868e0256-9c68-4772-8ae4-45e88a055901-manager-config\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.594725 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.594619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/868e0256-9c68-4772-8ae4-45e88a055901-metrics-cert\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.595352 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.595325 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/868e0256-9c68-4772-8ae4-45e88a055901-manager-config\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.597103 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.597058 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/868e0256-9c68-4772-8ae4-45e88a055901-cert\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " 
pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.597256 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.597234 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/868e0256-9c68-4772-8ae4-45e88a055901-metrics-cert\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.605955 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.605932 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95sg5\" (UniqueName: \"kubernetes.io/projected/868e0256-9c68-4772-8ae4-45e88a055901-kube-api-access-95sg5\") pod \"lws-controller-manager-bb95b586d-plmjx\" (UID: \"868e0256-9c68-4772-8ae4-45e88a055901\") " pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.777499 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.777415 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:00.903476 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:00.903440 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx"] Apr 21 04:45:00.905853 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:45:00.905830 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868e0256_9c68_4772_8ae4_45e88a055901.slice/crio-85ee95c68f70f9fbfede25fb5aeebe4ecf51eaadfe8208c7b8f6f1c7daf7c08f WatchSource:0}: Error finding container 85ee95c68f70f9fbfede25fb5aeebe4ecf51eaadfe8208c7b8f6f1c7daf7c08f: Status 404 returned error can't find the container with id 85ee95c68f70f9fbfede25fb5aeebe4ecf51eaadfe8208c7b8f6f1c7daf7c08f Apr 21 04:45:01.785465 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:01.785423 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" event={"ID":"868e0256-9c68-4772-8ae4-45e88a055901","Type":"ContainerStarted","Data":"85ee95c68f70f9fbfede25fb5aeebe4ecf51eaadfe8208c7b8f6f1c7daf7c08f"} Apr 21 04:45:03.793948 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:03.793911 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" event={"ID":"868e0256-9c68-4772-8ae4-45e88a055901","Type":"ContainerStarted","Data":"ba667aedb635833903295170d759c082aea76aa83c1a8e036a9cf164c5ddcfd7"} Apr 21 04:45:03.794375 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:03.793984 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:45:03.814782 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:03.814738 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" podStartSLOduration=1.3910275269999999 podStartE2EDuration="3.814724122s" podCreationTimestamp="2026-04-21 04:45:00 +0000 UTC" firstStartedPulling="2026-04-21 04:45:00.907629783 +0000 UTC m=+356.029187572" lastFinishedPulling="2026-04-21 04:45:03.331326379 +0000 UTC m=+358.452884167" observedRunningTime="2026-04-21 04:45:03.814562514 +0000 UTC m=+358.936120354" watchObservedRunningTime="2026-04-21 04:45:03.814724122 +0000 UTC m=+358.936281929" Apr 21 04:45:05.763964 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:05.763933 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-55ddb68486-pbs7j" Apr 21 04:45:09.811380 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.811342 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw"] Apr 21 04:45:09.814206 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.814183 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.816976 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.816959 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 04:45:09.817091 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.816995 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 04:45:09.817798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.817780 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-6kdpx\"" Apr 21 04:45:09.824934 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.824911 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw"] Apr 21 04:45:09.887687 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.887647 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgspt\" (UniqueName: \"kubernetes.io/projected/abcb51c1-3dee-45ae-8fb8-36db34a5e745-kube-api-access-kgspt\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.887870 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.887732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abcb51c1-3dee-45ae-8fb8-36db34a5e745-tls-certs\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.887870 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.887760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/abcb51c1-3dee-45ae-8fb8-36db34a5e745-tmp\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.988988 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.988952 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abcb51c1-3dee-45ae-8fb8-36db34a5e745-tls-certs\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.989171 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.988999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abcb51c1-3dee-45ae-8fb8-36db34a5e745-tmp\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.989171 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.989022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgspt\" (UniqueName: \"kubernetes.io/projected/abcb51c1-3dee-45ae-8fb8-36db34a5e745-kube-api-access-kgspt\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.991271 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.991239 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abcb51c1-3dee-45ae-8fb8-36db34a5e745-tmp\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.991383 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.991361 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/abcb51c1-3dee-45ae-8fb8-36db34a5e745-tls-certs\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:09.997315 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:09.997292 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgspt\" (UniqueName: \"kubernetes.io/projected/abcb51c1-3dee-45ae-8fb8-36db34a5e745-kube-api-access-kgspt\") pod \"kube-auth-proxy-788fdfdbbd-4svlw\" (UID: \"abcb51c1-3dee-45ae-8fb8-36db34a5e745\") " pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:10.124471 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:10.124435 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" Apr 21 04:45:10.265607 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:10.265535 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw"] Apr 21 04:45:10.267927 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:45:10.267890 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabcb51c1_3dee_45ae_8fb8_36db34a5e745.slice/crio-5810c9dbc28efdc803daaf5bc587242abd7ab5ec3129255f5d6b17c16fff7e2d WatchSource:0}: Error finding container 5810c9dbc28efdc803daaf5bc587242abd7ab5ec3129255f5d6b17c16fff7e2d: Status 404 returned error can't find the container with id 5810c9dbc28efdc803daaf5bc587242abd7ab5ec3129255f5d6b17c16fff7e2d Apr 21 04:45:10.820007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:10.819961 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" 
event={"ID":"abcb51c1-3dee-45ae-8fb8-36db34a5e745","Type":"ContainerStarted","Data":"5810c9dbc28efdc803daaf5bc587242abd7ab5ec3129255f5d6b17c16fff7e2d"} Apr 21 04:45:13.832622 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:13.832590 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" event={"ID":"abcb51c1-3dee-45ae-8fb8-36db34a5e745","Type":"ContainerStarted","Data":"5041439926a2db08fea7be3f52951d3ca7561f1497896ae4185c57537aa61a26"} Apr 21 04:45:13.850603 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:13.850551 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-788fdfdbbd-4svlw" podStartSLOduration=1.830536653 podStartE2EDuration="4.850537966s" podCreationTimestamp="2026-04-21 04:45:09 +0000 UTC" firstStartedPulling="2026-04-21 04:45:10.269900792 +0000 UTC m=+365.391458588" lastFinishedPulling="2026-04-21 04:45:13.289902112 +0000 UTC m=+368.411459901" observedRunningTime="2026-04-21 04:45:13.849484907 +0000 UTC m=+368.971042714" watchObservedRunningTime="2026-04-21 04:45:13.850537966 +0000 UTC m=+368.972095772" Apr 21 04:45:14.800170 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:45:14.800140 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-bb95b586d-plmjx" Apr 21 04:46:49.049622 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.049586 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h"] Apr 21 04:46:49.053174 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.053157 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" Apr 21 04:46:49.056252 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.056224 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 04:46:49.056370 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.056260 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-6bfw2\"" Apr 21 04:46:49.057340 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.057324 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 04:46:49.057614 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.057599 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 04:46:49.064770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.064750 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h"] Apr 21 04:46:49.163567 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.163532 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4sm4\" (UniqueName: \"kubernetes.io/projected/5e69f849-cd12-40d2-8bea-8818945861ad-kube-api-access-c4sm4\") pod \"dns-operator-controller-manager-648d5c98bc-9wb8h\" (UID: \"5e69f849-cd12-40d2-8bea-8818945861ad\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" Apr 21 04:46:49.264568 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.264532 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4sm4\" (UniqueName: \"kubernetes.io/projected/5e69f849-cd12-40d2-8bea-8818945861ad-kube-api-access-c4sm4\") pod 
\"dns-operator-controller-manager-648d5c98bc-9wb8h\" (UID: \"5e69f849-cd12-40d2-8bea-8818945861ad\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" Apr 21 04:46:49.273618 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.273584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4sm4\" (UniqueName: \"kubernetes.io/projected/5e69f849-cd12-40d2-8bea-8818945861ad-kube-api-access-c4sm4\") pod \"dns-operator-controller-manager-648d5c98bc-9wb8h\" (UID: \"5e69f849-cd12-40d2-8bea-8818945861ad\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" Apr 21 04:46:49.363903 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.363806 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" Apr 21 04:46:49.494841 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:49.494812 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h"] Apr 21 04:46:49.498390 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:46:49.498348 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e69f849_cd12_40d2_8bea_8818945861ad.slice/crio-099efacc0d9b40e6ca9a70e647d92555567ac160d4bcb4bb240046a3867ddca0 WatchSource:0}: Error finding container 099efacc0d9b40e6ca9a70e647d92555567ac160d4bcb4bb240046a3867ddca0: Status 404 returned error can't find the container with id 099efacc0d9b40e6ca9a70e647d92555567ac160d4bcb4bb240046a3867ddca0 Apr 21 04:46:50.151807 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:50.151767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" event={"ID":"5e69f849-cd12-40d2-8bea-8818945861ad","Type":"ContainerStarted","Data":"099efacc0d9b40e6ca9a70e647d92555567ac160d4bcb4bb240046a3867ddca0"} Apr 21 
04:46:52.162411 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:52.162373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" event={"ID":"5e69f849-cd12-40d2-8bea-8818945861ad","Type":"ContainerStarted","Data":"9a86ff2ad769e09e55adcc0a127846f25b95a985f5c1c0ad51aa0691c9409843"} Apr 21 04:46:52.162813 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:52.162503 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" Apr 21 04:46:52.189887 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:46:52.189828 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" podStartSLOduration=0.829680682 podStartE2EDuration="3.189812727s" podCreationTimestamp="2026-04-21 04:46:49 +0000 UTC" firstStartedPulling="2026-04-21 04:46:49.501314216 +0000 UTC m=+464.622872014" lastFinishedPulling="2026-04-21 04:46:51.861446259 +0000 UTC m=+466.983004059" observedRunningTime="2026-04-21 04:46:52.186599047 +0000 UTC m=+467.308156853" watchObservedRunningTime="2026-04-21 04:46:52.189812727 +0000 UTC m=+467.311370515" Apr 21 04:47:03.168313 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.168278 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9wb8h" Apr 21 04:47:03.262758 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.262724 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr"] Apr 21 04:47:03.265927 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.265911 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.268873 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.268851 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rxf4s\"" Apr 21 04:47:03.278090 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.278050 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr"] Apr 21 04:47:03.401651 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.401614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7d1e892c-2a50-4136-bda4-431ffab2513b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xw2hr\" (UID: \"7d1e892c-2a50-4136-bda4-431ffab2513b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.401651 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.401665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntl9t\" (UniqueName: \"kubernetes.io/projected/7d1e892c-2a50-4136-bda4-431ffab2513b-kube-api-access-ntl9t\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xw2hr\" (UID: \"7d1e892c-2a50-4136-bda4-431ffab2513b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.503124 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.503016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7d1e892c-2a50-4136-bda4-431ffab2513b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xw2hr\" (UID: \"7d1e892c-2a50-4136-bda4-431ffab2513b\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.503124 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.503100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntl9t\" (UniqueName: \"kubernetes.io/projected/7d1e892c-2a50-4136-bda4-431ffab2513b-kube-api-access-ntl9t\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xw2hr\" (UID: \"7d1e892c-2a50-4136-bda4-431ffab2513b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.503457 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.503438 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7d1e892c-2a50-4136-bda4-431ffab2513b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xw2hr\" (UID: \"7d1e892c-2a50-4136-bda4-431ffab2513b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.512754 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.512724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntl9t\" (UniqueName: \"kubernetes.io/projected/7d1e892c-2a50-4136-bda4-431ffab2513b-kube-api-access-ntl9t\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xw2hr\" (UID: \"7d1e892c-2a50-4136-bda4-431ffab2513b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.576317 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.576287 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:03.712035 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:03.712002 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr"] Apr 21 04:47:03.715859 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:47:03.715829 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d1e892c_2a50_4136_bda4_431ffab2513b.slice/crio-ef9d96ac7953be71ccf655d49f07f9e843b19777986e715704f0fc5bc358ef30 WatchSource:0}: Error finding container ef9d96ac7953be71ccf655d49f07f9e843b19777986e715704f0fc5bc358ef30: Status 404 returned error can't find the container with id ef9d96ac7953be71ccf655d49f07f9e843b19777986e715704f0fc5bc358ef30 Apr 21 04:47:04.201552 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:04.201518 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" event={"ID":"7d1e892c-2a50-4136-bda4-431ffab2513b","Type":"ContainerStarted","Data":"ef9d96ac7953be71ccf655d49f07f9e843b19777986e715704f0fc5bc358ef30"} Apr 21 04:47:08.217712 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:08.217676 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" event={"ID":"7d1e892c-2a50-4136-bda4-431ffab2513b","Type":"ContainerStarted","Data":"6f58c6504f4bf7ec6d52dea16756616ab88925e3373bc431af310d37793655dd"} Apr 21 04:47:08.218186 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:08.217874 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:08.239592 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:08.239543 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" podStartSLOduration=1.52543835 podStartE2EDuration="5.239530381s" podCreationTimestamp="2026-04-21 04:47:03 +0000 UTC" firstStartedPulling="2026-04-21 04:47:03.718225725 +0000 UTC m=+478.839783510" lastFinishedPulling="2026-04-21 04:47:07.43231775 +0000 UTC m=+482.553875541" observedRunningTime="2026-04-21 04:47:08.237732398 +0000 UTC m=+483.359290205" watchObservedRunningTime="2026-04-21 04:47:08.239530381 +0000 UTC m=+483.361088188" Apr 21 04:47:19.224194 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:19.224163 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xw2hr" Apr 21 04:47:40.641583 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.641545 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-znfv8"] Apr 21 04:47:40.650203 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.650165 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" Apr 21 04:47:40.651597 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.651574 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-znfv8"] Apr 21 04:47:40.652718 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.652694 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-xpblk\"" Apr 21 04:47:40.733807 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.733774 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgpt4\" (UniqueName: \"kubernetes.io/projected/3c0d69d7-e21e-4a02-b2e4-4a48044de5b0-kube-api-access-xgpt4\") pod \"authorino-f99f4b5cd-znfv8\" (UID: \"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0\") " pod="kuadrant-system/authorino-f99f4b5cd-znfv8" Apr 21 04:47:40.835110 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.835048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgpt4\" (UniqueName: \"kubernetes.io/projected/3c0d69d7-e21e-4a02-b2e4-4a48044de5b0-kube-api-access-xgpt4\") pod \"authorino-f99f4b5cd-znfv8\" (UID: \"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0\") " pod="kuadrant-system/authorino-f99f4b5cd-znfv8" Apr 21 04:47:40.846292 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.846258 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgpt4\" (UniqueName: \"kubernetes.io/projected/3c0d69d7-e21e-4a02-b2e4-4a48044de5b0-kube-api-access-xgpt4\") pod \"authorino-f99f4b5cd-znfv8\" (UID: \"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0\") " pod="kuadrant-system/authorino-f99f4b5cd-znfv8" Apr 21 04:47:40.862878 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.862850 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-b2wqs"] Apr 21 04:47:40.866491 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:47:40.866472 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-b2wqs" Apr 21 04:47:40.874025 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.873881 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-b2wqs"] Apr 21 04:47:40.967850 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:40.967761 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" Apr 21 04:47:41.036777 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:41.036726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2gdk\" (UniqueName: \"kubernetes.io/projected/ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca-kube-api-access-f2gdk\") pod \"authorino-7498df8756-b2wqs\" (UID: \"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca\") " pod="kuadrant-system/authorino-7498df8756-b2wqs" Apr 21 04:47:41.086408 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:41.086379 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-znfv8"] Apr 21 04:47:41.089103 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:47:41.089061 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0d69d7_e21e_4a02_b2e4_4a48044de5b0.slice/crio-8086fe2aa5933efcb240b33fa5cf23ceb9568ab733aa241ce7dbf4175c58555a WatchSource:0}: Error finding container 8086fe2aa5933efcb240b33fa5cf23ceb9568ab733aa241ce7dbf4175c58555a: Status 404 returned error can't find the container with id 8086fe2aa5933efcb240b33fa5cf23ceb9568ab733aa241ce7dbf4175c58555a Apr 21 04:47:41.138046 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:41.138011 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2gdk\" (UniqueName: 
\"kubernetes.io/projected/ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca-kube-api-access-f2gdk\") pod \"authorino-7498df8756-b2wqs\" (UID: \"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca\") " pod="kuadrant-system/authorino-7498df8756-b2wqs" Apr 21 04:47:41.146646 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:41.146618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2gdk\" (UniqueName: \"kubernetes.io/projected/ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca-kube-api-access-f2gdk\") pod \"authorino-7498df8756-b2wqs\" (UID: \"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca\") " pod="kuadrant-system/authorino-7498df8756-b2wqs" Apr 21 04:47:41.177492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:41.177456 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-b2wqs" Apr 21 04:47:41.299617 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:41.299504 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-b2wqs"] Apr 21 04:47:41.301795 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:47:41.301763 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad0e49eb_5d54_4dc7_98d3_1ada1c6596ca.slice/crio-a3df5832bf5dd9e66cf39144ad7f00a4eadaae9ca3b32ad30996419c74d88645 WatchSource:0}: Error finding container a3df5832bf5dd9e66cf39144ad7f00a4eadaae9ca3b32ad30996419c74d88645: Status 404 returned error can't find the container with id a3df5832bf5dd9e66cf39144ad7f00a4eadaae9ca3b32ad30996419c74d88645 Apr 21 04:47:41.333749 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:41.333712 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" event={"ID":"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0","Type":"ContainerStarted","Data":"8086fe2aa5933efcb240b33fa5cf23ceb9568ab733aa241ce7dbf4175c58555a"} Apr 21 04:47:41.334715 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:47:41.334693 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-b2wqs" event={"ID":"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca","Type":"ContainerStarted","Data":"a3df5832bf5dd9e66cf39144ad7f00a4eadaae9ca3b32ad30996419c74d88645"} Apr 21 04:47:45.351861 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:45.351825 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" event={"ID":"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0","Type":"ContainerStarted","Data":"5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8"} Apr 21 04:47:45.353196 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:45.353174 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-b2wqs" event={"ID":"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca","Type":"ContainerStarted","Data":"d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff"} Apr 21 04:47:45.368408 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:45.368361 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" podStartSLOduration=1.814908172 podStartE2EDuration="5.368348079s" podCreationTimestamp="2026-04-21 04:47:40 +0000 UTC" firstStartedPulling="2026-04-21 04:47:41.090569587 +0000 UTC m=+516.212127375" lastFinishedPulling="2026-04-21 04:47:44.644009487 +0000 UTC m=+519.765567282" observedRunningTime="2026-04-21 04:47:45.367339963 +0000 UTC m=+520.488897771" watchObservedRunningTime="2026-04-21 04:47:45.368348079 +0000 UTC m=+520.489905885" Apr 21 04:47:45.381429 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:45.381385 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-b2wqs" podStartSLOduration=2.029554456 podStartE2EDuration="5.381369826s" podCreationTimestamp="2026-04-21 04:47:40 +0000 UTC" firstStartedPulling="2026-04-21 04:47:41.303264437 +0000 UTC m=+516.424822222" 
lastFinishedPulling="2026-04-21 04:47:44.655079797 +0000 UTC m=+519.776637592" observedRunningTime="2026-04-21 04:47:45.380932182 +0000 UTC m=+520.502489989" watchObservedRunningTime="2026-04-21 04:47:45.381369826 +0000 UTC m=+520.502927638" Apr 21 04:47:45.415450 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:45.415413 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-znfv8"] Apr 21 04:47:47.359673 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:47.359635 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" podUID="3c0d69d7-e21e-4a02-b2e4-4a48044de5b0" containerName="authorino" containerID="cri-o://5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8" gracePeriod=30 Apr 21 04:47:47.606178 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:47.606153 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" Apr 21 04:47:47.804380 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:47.804339 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgpt4\" (UniqueName: \"kubernetes.io/projected/3c0d69d7-e21e-4a02-b2e4-4a48044de5b0-kube-api-access-xgpt4\") pod \"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0\" (UID: \"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0\") " Apr 21 04:47:47.806477 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:47.806450 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0d69d7-e21e-4a02-b2e4-4a48044de5b0-kube-api-access-xgpt4" (OuterVolumeSpecName: "kube-api-access-xgpt4") pod "3c0d69d7-e21e-4a02-b2e4-4a48044de5b0" (UID: "3c0d69d7-e21e-4a02-b2e4-4a48044de5b0"). InnerVolumeSpecName "kube-api-access-xgpt4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:47:47.905308 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:47.905272 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xgpt4\" (UniqueName: \"kubernetes.io/projected/3c0d69d7-e21e-4a02-b2e4-4a48044de5b0-kube-api-access-xgpt4\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:47:48.365444 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.365409 2568 generic.go:358] "Generic (PLEG): container finished" podID="3c0d69d7-e21e-4a02-b2e4-4a48044de5b0" containerID="5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8" exitCode=0 Apr 21 04:47:48.365883 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.365460 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" Apr 21 04:47:48.365883 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.365491 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" event={"ID":"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0","Type":"ContainerDied","Data":"5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8"} Apr 21 04:47:48.365883 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.365532 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-znfv8" event={"ID":"3c0d69d7-e21e-4a02-b2e4-4a48044de5b0","Type":"ContainerDied","Data":"8086fe2aa5933efcb240b33fa5cf23ceb9568ab733aa241ce7dbf4175c58555a"} Apr 21 04:47:48.365883 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.365548 2568 scope.go:117] "RemoveContainer" containerID="5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8" Apr 21 04:47:48.374377 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.374361 2568 scope.go:117] "RemoveContainer" containerID="5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8" Apr 21 04:47:48.374658 ip-10-0-141-241 kubenswrapper[2568]: E0421 
04:47:48.374638 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8\": container with ID starting with 5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8 not found: ID does not exist" containerID="5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8" Apr 21 04:47:48.374774 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.374666 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8"} err="failed to get container status \"5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8\": rpc error: code = NotFound desc = could not find container \"5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8\": container with ID starting with 5594ec5bb45f01c666cbadbbaa5e171cbea82bdf228809cb7a5e3f92294e40c8 not found: ID does not exist" Apr 21 04:47:48.387589 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.387559 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-znfv8"] Apr 21 04:47:48.391975 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:48.391951 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-znfv8"] Apr 21 04:47:49.485648 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:47:49.485606 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0d69d7-e21e-4a02-b2e4-4a48044de5b0" path="/var/lib/kubelet/pods/3c0d69d7-e21e-4a02-b2e4-4a48044de5b0/volumes" Apr 21 04:48:38.096895 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.096863 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:48:38.097400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.097254 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="3c0d69d7-e21e-4a02-b2e4-4a48044de5b0" containerName="authorino" Apr 21 04:48:38.097400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.097266 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0d69d7-e21e-4a02-b2e4-4a48044de5b0" containerName="authorino" Apr 21 04:48:38.097400 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.097324 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c0d69d7-e21e-4a02-b2e4-4a48044de5b0" containerName="authorino" Apr 21 04:48:38.100564 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.100547 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 04:48:38.103359 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.103322 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 04:48:38.103461 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.103426 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 04:48:38.104366 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.104348 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 04:48:38.104447 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.104348 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-lmh74\"" Apr 21 04:48:38.108109 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.108047 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:48:38.147137 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.147106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmpt9\" (UniqueName: 
\"kubernetes.io/projected/b0d228d8-53f1-48d3-a7cd-39205a1ff870-kube-api-access-qmpt9\") pod \"maas-keycloak-0\" (UID: \"b0d228d8-53f1-48d3-a7cd-39205a1ff870\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:48:38.247986 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.247950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmpt9\" (UniqueName: \"kubernetes.io/projected/b0d228d8-53f1-48d3-a7cd-39205a1ff870-kube-api-access-qmpt9\") pod \"maas-keycloak-0\" (UID: \"b0d228d8-53f1-48d3-a7cd-39205a1ff870\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:48:38.257927 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.257899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmpt9\" (UniqueName: \"kubernetes.io/projected/b0d228d8-53f1-48d3-a7cd-39205a1ff870-kube-api-access-qmpt9\") pod \"maas-keycloak-0\" (UID: \"b0d228d8-53f1-48d3-a7cd-39205a1ff870\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:48:38.412414 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.412366 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:48:38.535791 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.535764 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 21 04:48:38.537658 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:48:38.537627 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d228d8_53f1_48d3_a7cd_39205a1ff870.slice/crio-e9b6551db59b61959af0d5867efc339069d46f673d8b4ea0a2e2b67e6dfbad53 WatchSource:0}: Error finding container e9b6551db59b61959af0d5867efc339069d46f673d8b4ea0a2e2b67e6dfbad53: Status 404 returned error can't find the container with id e9b6551db59b61959af0d5867efc339069d46f673d8b4ea0a2e2b67e6dfbad53
Apr 21 04:48:38.546129 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:38.546100 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"b0d228d8-53f1-48d3-a7cd-39205a1ff870","Type":"ContainerStarted","Data":"e9b6551db59b61959af0d5867efc339069d46f673d8b4ea0a2e2b67e6dfbad53"}
Apr 21 04:48:44.574049 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:44.573993 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"b0d228d8-53f1-48d3-a7cd-39205a1ff870","Type":"ContainerStarted","Data":"82f3e635467c2c37a2f914efa79e62b99cfafe707a630554d51fe6e82ac09601"}
Apr 21 04:48:44.592947 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:44.592884 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.489412435 podStartE2EDuration="6.59286773s" podCreationTimestamp="2026-04-21 04:48:38 +0000 UTC" firstStartedPulling="2026-04-21 04:48:38.539106867 +0000 UTC m=+573.660664653" lastFinishedPulling="2026-04-21 04:48:43.642562148 +0000 UTC m=+578.764119948" observedRunningTime="2026-04-21 04:48:44.590746167 +0000 UTC m=+579.712304013" watchObservedRunningTime="2026-04-21 04:48:44.59286773 +0000 UTC m=+579.714425536"
Apr 21 04:48:45.412610 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:45.412563 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:48:45.414297 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:45.414234 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:46.413594 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:46.413551 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:47.413309 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:47.413245 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:48.412889 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:48.412848 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:48:48.413367 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:48.413226 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:49.413082 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:49.413014 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:50.413309 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:50.413250 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:51.413869 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:51.413818 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:52.413247 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:52.413200 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:53.413666 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:53.413613 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:54.413684 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:54.413632 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:55.413339 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:55.413285 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:56.413421 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:56.413364 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.30:9000/health/started\": dial tcp 10.134.0.30:9000: connect: connection refused"
Apr 21 04:48:57.545928 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:57.545891 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:48:57.562120 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:48:57.562010 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 04:49:05.406757 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:05.406728 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log"
Apr 21 04:49:05.406757 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:05.406749 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log"
Apr 21 04:49:05.413330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:05.413309 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log"
Apr 21 04:49:05.413330 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:05.413322 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhdlv_8b8f621a-cf82-4827-a595-5b1724d2e7df/ovn-acl-logging/0.log"
Apr 21 04:49:07.553347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:07.553310 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
Apr 21 04:49:08.446025 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.445985 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dldt2"]
Apr 21 04:49:08.456008 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.455984 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dldt2"
Apr 21 04:49:08.456501 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.456478 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dldt2"]
Apr 21 04:49:08.546310 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.546269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzgwp\" (UniqueName: \"kubernetes.io/projected/8d084e33-dbb8-4897-9fa5-121df2fc13ba-kube-api-access-zzgwp\") pod \"authorino-8b475cf9f-dldt2\" (UID: \"8d084e33-dbb8-4897-9fa5-121df2fc13ba\") " pod="kuadrant-system/authorino-8b475cf9f-dldt2"
Apr 21 04:49:08.628403 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.628365 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dldt2"]
Apr 21 04:49:08.628995 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:49:08.628618 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zzgwp], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-dldt2" podUID="8d084e33-dbb8-4897-9fa5-121df2fc13ba"
Apr 21 04:49:08.646994 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.646965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzgwp\" (UniqueName: \"kubernetes.io/projected/8d084e33-dbb8-4897-9fa5-121df2fc13ba-kube-api-access-zzgwp\") pod \"authorino-8b475cf9f-dldt2\" (UID: \"8d084e33-dbb8-4897-9fa5-121df2fc13ba\") " pod="kuadrant-system/authorino-8b475cf9f-dldt2"
Apr 21 04:49:08.654854 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.654829 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-pz44q"]
Apr 21 04:49:08.658491 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.658476 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:08.661379 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.661359 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 21 04:49:08.661549 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.661530 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzgwp\" (UniqueName: \"kubernetes.io/projected/8d084e33-dbb8-4897-9fa5-121df2fc13ba-kube-api-access-zzgwp\") pod \"authorino-8b475cf9f-dldt2\" (UID: \"8d084e33-dbb8-4897-9fa5-121df2fc13ba\") " pod="kuadrant-system/authorino-8b475cf9f-dldt2"
Apr 21 04:49:08.669523 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.669499 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-pz44q"]
Apr 21 04:49:08.675007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.674980 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dldt2"
Apr 21 04:49:08.682175 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.682153 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dldt2"
Apr 21 04:49:08.721019 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.720942 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-pz44q"]
Apr 21 04:49:08.721263 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:49:08.721240 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7244x tls-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-pz44q" podUID="0acb8a82-c86a-4810-ba96-30abdfdb7ee4"
Apr 21 04:49:08.747628 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.747599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-tls-cert\") pod \"authorino-56fdd757f5-pz44q\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") " pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:08.747744 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.747701 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7244x\" (UniqueName: \"kubernetes.io/projected/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-kube-api-access-7244x\") pod \"authorino-56fdd757f5-pz44q\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") " pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:08.750381 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.750359 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-d5fc7c786-mbdf7"]
Apr 21 04:49:08.753905 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.753890 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:08.763104 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.763064 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-d5fc7c786-mbdf7"]
Apr 21 04:49:08.848349 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.848315 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzgwp\" (UniqueName: \"kubernetes.io/projected/8d084e33-dbb8-4897-9fa5-121df2fc13ba-kube-api-access-zzgwp\") pod \"8d084e33-dbb8-4897-9fa5-121df2fc13ba\" (UID: \"8d084e33-dbb8-4897-9fa5-121df2fc13ba\") "
Apr 21 04:49:08.848527 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.848450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-tls-cert\") pod \"authorino-56fdd757f5-pz44q\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") " pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:08.848527 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.848485 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-tls-cert\") pod \"authorino-d5fc7c786-mbdf7\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:08.848527 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.848510 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbldt\" (UniqueName: \"kubernetes.io/projected/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-kube-api-access-cbldt\") pod \"authorino-d5fc7c786-mbdf7\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:08.848661 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.848579 2568
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7244x\" (UniqueName: \"kubernetes.io/projected/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-kube-api-access-7244x\") pod \"authorino-56fdd757f5-pz44q\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") " pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:08.850419 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.850387 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d084e33-dbb8-4897-9fa5-121df2fc13ba-kube-api-access-zzgwp" (OuterVolumeSpecName: "kube-api-access-zzgwp") pod "8d084e33-dbb8-4897-9fa5-121df2fc13ba" (UID: "8d084e33-dbb8-4897-9fa5-121df2fc13ba"). InnerVolumeSpecName "kube-api-access-zzgwp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:49:08.850890 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.850873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-tls-cert\") pod \"authorino-56fdd757f5-pz44q\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") " pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:08.856279 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.856257 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7244x\" (UniqueName: \"kubernetes.io/projected/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-kube-api-access-7244x\") pod \"authorino-56fdd757f5-pz44q\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") " pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:08.949679 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.949629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-tls-cert\") pod \"authorino-d5fc7c786-mbdf7\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:08.949872 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.949689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbldt\" (UniqueName: \"kubernetes.io/projected/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-kube-api-access-cbldt\") pod \"authorino-d5fc7c786-mbdf7\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:08.949872 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.949759 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzgwp\" (UniqueName: \"kubernetes.io/projected/8d084e33-dbb8-4897-9fa5-121df2fc13ba-kube-api-access-zzgwp\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:49:08.952140 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.952119 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-tls-cert\") pod \"authorino-d5fc7c786-mbdf7\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:08.958115 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:08.958063 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbldt\" (UniqueName: \"kubernetes.io/projected/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-kube-api-access-cbldt\") pod \"authorino-d5fc7c786-mbdf7\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:09.063821 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.063737 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-d5fc7c786-mbdf7"
Apr 21 04:49:09.190348 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.190321 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-d5fc7c786-mbdf7"]
Apr 21 04:49:09.192007 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:49:09.191965 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod404b9a39_e6cc_4ca7_8fcb_bcb93c01851b.slice/crio-fb5f84e14cddd537bcdd8af764bbcf04dbe2d6a9b00449d4ee50fe5f7be1b89c WatchSource:0}: Error finding container fb5f84e14cddd537bcdd8af764bbcf04dbe2d6a9b00449d4ee50fe5f7be1b89c: Status 404 returned error can't find the container with id fb5f84e14cddd537bcdd8af764bbcf04dbe2d6a9b00449d4ee50fe5f7be1b89c
Apr 21 04:49:09.681360 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.681321 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" event={"ID":"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b","Type":"ContainerStarted","Data":"e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8"}
Apr 21 04:49:09.681798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.681372 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:09.681798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.681397 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-dldt2"
Apr 21 04:49:09.681798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.681398 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" event={"ID":"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b","Type":"ContainerStarted","Data":"fb5f84e14cddd537bcdd8af764bbcf04dbe2d6a9b00449d4ee50fe5f7be1b89c"}
Apr 21 04:49:09.690877 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.690854 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:09.700955 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.700915 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" podStartSLOduration=1.342779204 podStartE2EDuration="1.700903112s" podCreationTimestamp="2026-04-21 04:49:08 +0000 UTC" firstStartedPulling="2026-04-21 04:49:09.193290342 +0000 UTC m=+604.314848128" lastFinishedPulling="2026-04-21 04:49:09.551414247 +0000 UTC m=+604.672972036" observedRunningTime="2026-04-21 04:49:09.700087906 +0000 UTC m=+604.821645708" watchObservedRunningTime="2026-04-21 04:49:09.700903112 +0000 UTC m=+604.822460921"
Apr 21 04:49:09.723891 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.723864 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-b2wqs"]
Apr 21 04:49:09.724063 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.724039 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-b2wqs" podUID="ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca" containerName="authorino" containerID="cri-o://d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff" gracePeriod=30
Apr 21 04:49:09.738347 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.738317 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dldt2"]
Apr 21 04:49:09.742835 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.742813 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-dldt2"]
Apr 21 04:49:09.857184 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.857154 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7244x\" (UniqueName: \"kubernetes.io/projected/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-kube-api-access-7244x\") pod \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") "
Apr 21 04:49:09.857343 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.857198 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-tls-cert\") pod \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\" (UID: \"0acb8a82-c86a-4810-ba96-30abdfdb7ee4\") "
Apr 21 04:49:09.859188 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.859160 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "0acb8a82-c86a-4810-ba96-30abdfdb7ee4" (UID: "0acb8a82-c86a-4810-ba96-30abdfdb7ee4"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:49:09.859188 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.859180 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-kube-api-access-7244x" (OuterVolumeSpecName: "kube-api-access-7244x") pod "0acb8a82-c86a-4810-ba96-30abdfdb7ee4" (UID: "0acb8a82-c86a-4810-ba96-30abdfdb7ee4"). InnerVolumeSpecName "kube-api-access-7244x".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:49:09.957989 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.957962 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7244x\" (UniqueName: \"kubernetes.io/projected/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-kube-api-access-7244x\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:49:09.958145 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.957992 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0acb8a82-c86a-4810-ba96-30abdfdb7ee4-tls-cert\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:49:09.958913 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:09.958889 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-b2wqs"
Apr 21 04:49:10.058546 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.058447 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2gdk\" (UniqueName: \"kubernetes.io/projected/ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca-kube-api-access-f2gdk\") pod \"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca\" (UID: \"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca\") "
Apr 21 04:49:10.060518 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.060491 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca-kube-api-access-f2gdk" (OuterVolumeSpecName: "kube-api-access-f2gdk") pod "ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca" (UID: "ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca"). InnerVolumeSpecName "kube-api-access-f2gdk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:49:10.160033 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.159994 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f2gdk\" (UniqueName: \"kubernetes.io/projected/ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca-kube-api-access-f2gdk\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\""
Apr 21 04:49:10.685903 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.685867 2568 generic.go:358] "Generic (PLEG): container finished" podID="ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca" containerID="d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff" exitCode=0
Apr 21 04:49:10.686364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.685918 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-b2wqs"
Apr 21 04:49:10.686364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.685942 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-b2wqs" event={"ID":"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca","Type":"ContainerDied","Data":"d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff"}
Apr 21 04:49:10.686364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.685979 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-b2wqs" event={"ID":"ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca","Type":"ContainerDied","Data":"a3df5832bf5dd9e66cf39144ad7f00a4eadaae9ca3b32ad30996419c74d88645"}
Apr 21 04:49:10.686364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.685993 2568 scope.go:117] "RemoveContainer" containerID="d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff"
Apr 21 04:49:10.686364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.686052 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-pz44q"
Apr 21 04:49:10.696845 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.695940 2568 scope.go:117] "RemoveContainer" containerID="d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff"
Apr 21 04:49:10.696845 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:49:10.696370 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff\": container with ID starting with d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff not found: ID does not exist" containerID="d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff"
Apr 21 04:49:10.696845 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.696413 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff"} err="failed to get container status \"d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff\": rpc error: code = NotFound desc = could not find container \"d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff\": container with ID starting with d9e2c3fc7ceb62d4f3ace7a9ec1cf0453f61b71260ead0ccfd67e5376e0037ff not found: ID does not exist"
Apr 21 04:49:10.732561 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.732527 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-pz44q"]
Apr 21 04:49:10.738120 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.738091 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-pz44q"]
Apr 21 04:49:10.747731 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.747708 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-b2wqs"]
Apr 21 04:49:10.751305 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.751283 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-b2wqs"]
Apr 21 04:49:10.838146 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.838119 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-77c85978cd-sk7z6"]
Apr 21 04:49:10.838522 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.838511 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca" containerName="authorino"
Apr 21 04:49:10.838568 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.838524 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca" containerName="authorino"
Apr 21 04:49:10.838602 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.838581 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca" containerName="authorino"
Apr 21 04:49:10.842966 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.842950 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:10.847784 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.847767 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-n4cb5\""
Apr 21 04:49:10.861371 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.861349 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c85978cd-sk7z6"]
Apr 21 04:49:10.967130 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:10.967007 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl6s5\" (UniqueName: \"kubernetes.io/projected/baced40b-d8b2-4484-88f0-6445726d4775-kube-api-access-rl6s5\") pod \"maas-controller-77c85978cd-sk7z6\" (UID: \"baced40b-d8b2-4484-88f0-6445726d4775\") " pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:11.067452 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.067409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl6s5\" (UniqueName: \"kubernetes.io/projected/baced40b-d8b2-4484-88f0-6445726d4775-kube-api-access-rl6s5\") pod \"maas-controller-77c85978cd-sk7z6\" (UID: \"baced40b-d8b2-4484-88f0-6445726d4775\") " pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:11.076497 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.076465 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl6s5\" (UniqueName: \"kubernetes.io/projected/baced40b-d8b2-4484-88f0-6445726d4775-kube-api-access-rl6s5\") pod \"maas-controller-77c85978cd-sk7z6\" (UID: \"baced40b-d8b2-4484-88f0-6445726d4775\") " pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:11.153431 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.153395 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:11.274487 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.274454 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c85978cd-sk7z6"]
Apr 21 04:49:11.277375 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:49:11.277346 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaced40b_d8b2_4484_88f0_6445726d4775.slice/crio-a168737cbd0bdf2b5ccdd660ac84082c1f37676387233df37635edfbabf83a8e WatchSource:0}: Error finding container a168737cbd0bdf2b5ccdd660ac84082c1f37676387233df37635edfbabf83a8e: Status 404 returned error can't find the container with id a168737cbd0bdf2b5ccdd660ac84082c1f37676387233df37635edfbabf83a8e
Apr 21 04:49:11.485664 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.485575 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0acb8a82-c86a-4810-ba96-30abdfdb7ee4" path="/var/lib/kubelet/pods/0acb8a82-c86a-4810-ba96-30abdfdb7ee4/volumes"
Apr 21 04:49:11.485902 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.485884 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d084e33-dbb8-4897-9fa5-121df2fc13ba" path="/var/lib/kubelet/pods/8d084e33-dbb8-4897-9fa5-121df2fc13ba/volumes"
Apr 21 04:49:11.486169 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.486152 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca" path="/var/lib/kubelet/pods/ad0e49eb-5d54-4dc7-98d3-1ada1c6596ca/volumes"
Apr 21 04:49:11.692844 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:11.692801 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c85978cd-sk7z6" event={"ID":"baced40b-d8b2-4484-88f0-6445726d4775","Type":"ContainerStarted","Data":"a168737cbd0bdf2b5ccdd660ac84082c1f37676387233df37635edfbabf83a8e"}
Apr 21 04:49:13.701034 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:13.700943 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c85978cd-sk7z6" event={"ID":"baced40b-d8b2-4484-88f0-6445726d4775","Type":"ContainerStarted","Data":"c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb"}
Apr 21 04:49:13.701034 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:13.701019 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:13.736544 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:13.736493 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-77c85978cd-sk7z6" podStartSLOduration=1.5785616519999999 podStartE2EDuration="3.736479003s" podCreationTimestamp="2026-04-21 04:49:10 +0000 UTC" firstStartedPulling="2026-04-21 04:49:11.278875063 +0000 UTC m=+606.400432851" lastFinishedPulling="2026-04-21 04:49:13.436792406 +0000 UTC m=+608.558350202" observedRunningTime="2026-04-21 04:49:13.735169653 +0000 UTC m=+608.856727460" watchObservedRunningTime="2026-04-21 04:49:13.736479003 +0000 UTC m=+608.858036806"
Apr 21 04:49:24.710332 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:24.710242 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:37.396676 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.396643 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-77c85978cd-sk7z6"]
Apr 21 04:49:37.397175 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.396895 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-77c85978cd-sk7z6" podUID="baced40b-d8b2-4484-88f0-6445726d4775" containerName="manager" containerID="cri-o://c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb" gracePeriod=10
Apr 21 04:49:37.638714 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.638693 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-77c85978cd-sk7z6"
Apr 21 04:49:37.706365 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.706281 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl6s5\" (UniqueName: \"kubernetes.io/projected/baced40b-d8b2-4484-88f0-6445726d4775-kube-api-access-rl6s5\") pod \"baced40b-d8b2-4484-88f0-6445726d4775\" (UID: \"baced40b-d8b2-4484-88f0-6445726d4775\") "
Apr 21 04:49:37.708463 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.708435 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baced40b-d8b2-4484-88f0-6445726d4775-kube-api-access-rl6s5" (OuterVolumeSpecName: "kube-api-access-rl6s5") pod "baced40b-d8b2-4484-88f0-6445726d4775" (UID: "baced40b-d8b2-4484-88f0-6445726d4775"). InnerVolumeSpecName "kube-api-access-rl6s5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:49:37.783731 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.783693 2568 generic.go:358] "Generic (PLEG): container finished" podID="baced40b-d8b2-4484-88f0-6445726d4775" containerID="c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb" exitCode=0
Apr 21 04:49:37.783881 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.783737 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c85978cd-sk7z6" event={"ID":"baced40b-d8b2-4484-88f0-6445726d4775","Type":"ContainerDied","Data":"c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb"}
Apr 21 04:49:37.783881 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.783761 2568 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="opendatahub/maas-controller-77c85978cd-sk7z6" Apr 21 04:49:37.783881 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.783776 2568 scope.go:117] "RemoveContainer" containerID="c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb" Apr 21 04:49:37.783881 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.783764 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c85978cd-sk7z6" event={"ID":"baced40b-d8b2-4484-88f0-6445726d4775","Type":"ContainerDied","Data":"a168737cbd0bdf2b5ccdd660ac84082c1f37676387233df37635edfbabf83a8e"} Apr 21 04:49:37.792800 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.792784 2568 scope.go:117] "RemoveContainer" containerID="c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb" Apr 21 04:49:37.793024 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:49:37.793004 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb\": container with ID starting with c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb not found: ID does not exist" containerID="c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb" Apr 21 04:49:37.793160 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.793031 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb"} err="failed to get container status \"c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb\": rpc error: code = NotFound desc = could not find container \"c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb\": container with ID starting with c76fdc5f2017e1dcb94eb93b277d82f423189e16d1f30571a22654f7a19adeeb not found: ID does not exist" Apr 21 04:49:37.807223 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.807193 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-77c85978cd-sk7z6"] Apr 21 04:49:37.807536 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.807519 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rl6s5\" (UniqueName: \"kubernetes.io/projected/baced40b-d8b2-4484-88f0-6445726d4775-kube-api-access-rl6s5\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:49:37.811365 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:37.811344 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-77c85978cd-sk7z6"] Apr 21 04:49:39.004141 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:39.004105 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:49:39.004586 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:39.004328 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" containerID="cri-o://82f3e635467c2c37a2f914efa79e62b99cfafe707a630554d51fe6e82ac09601" gracePeriod=30 Apr 21 04:49:39.485278 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:39.485246 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baced40b-d8b2-4484-88f0-6445726d4775" path="/var/lib/kubelet/pods/baced40b-d8b2-4484-88f0-6445726d4775/volumes" Apr 21 04:49:40.796426 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:40.796341 2568 generic.go:358] "Generic (PLEG): container finished" podID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerID="82f3e635467c2c37a2f914efa79e62b99cfafe707a630554d51fe6e82ac09601" exitCode=143 Apr 21 04:49:40.796426 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:40.796390 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" 
event={"ID":"b0d228d8-53f1-48d3-a7cd-39205a1ff870","Type":"ContainerDied","Data":"82f3e635467c2c37a2f914efa79e62b99cfafe707a630554d51fe6e82ac09601"} Apr 21 04:49:41.046612 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.046557 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:41.138091 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.138040 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmpt9\" (UniqueName: \"kubernetes.io/projected/b0d228d8-53f1-48d3-a7cd-39205a1ff870-kube-api-access-qmpt9\") pod \"b0d228d8-53f1-48d3-a7cd-39205a1ff870\" (UID: \"b0d228d8-53f1-48d3-a7cd-39205a1ff870\") " Apr 21 04:49:41.140146 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.140125 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d228d8-53f1-48d3-a7cd-39205a1ff870-kube-api-access-qmpt9" (OuterVolumeSpecName: "kube-api-access-qmpt9") pod "b0d228d8-53f1-48d3-a7cd-39205a1ff870" (UID: "b0d228d8-53f1-48d3-a7cd-39205a1ff870"). InnerVolumeSpecName "kube-api-access-qmpt9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:49:41.239491 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.239457 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmpt9\" (UniqueName: \"kubernetes.io/projected/b0d228d8-53f1-48d3-a7cd-39205a1ff870-kube-api-access-qmpt9\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:49:41.800839 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.800751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"b0d228d8-53f1-48d3-a7cd-39205a1ff870","Type":"ContainerDied","Data":"e9b6551db59b61959af0d5867efc339069d46f673d8b4ea0a2e2b67e6dfbad53"} Apr 21 04:49:41.800839 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.800769 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:41.800839 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.800802 2568 scope.go:117] "RemoveContainer" containerID="82f3e635467c2c37a2f914efa79e62b99cfafe707a630554d51fe6e82ac09601" Apr 21 04:49:41.820906 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.820881 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:49:41.825342 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.825317 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:49:41.850600 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.850573 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:49:41.850945 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.850932 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" Apr 21 04:49:41.850945 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.850946 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" Apr 21 04:49:41.851029 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.850964 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baced40b-d8b2-4484-88f0-6445726d4775" containerName="manager" Apr 21 04:49:41.851029 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.850970 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="baced40b-d8b2-4484-88f0-6445726d4775" containerName="manager" Apr 21 04:49:41.851128 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.851031 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" containerName="keycloak" Apr 21 04:49:41.851128 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.851045 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="baced40b-d8b2-4484-88f0-6445726d4775" containerName="manager" Apr 21 04:49:41.855364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.855348 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:41.857997 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.857972 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\"" Apr 21 04:49:41.858121 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.858017 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-lmh74\"" Apr 21 04:49:41.858121 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.857972 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 21 04:49:41.858121 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.858102 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 04:49:41.858295 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.858140 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 04:49:41.862660 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.862628 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:49:41.944708 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.944667 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzm8\" (UniqueName: \"kubernetes.io/projected/fafb0b03-f904-4b6b-860f-350804783feb-kube-api-access-chzm8\") pod \"maas-keycloak-0\" (UID: \"fafb0b03-f904-4b6b-860f-350804783feb\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:41.944894 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:41.944728 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/fafb0b03-f904-4b6b-860f-350804783feb-test-realms\") 
pod \"maas-keycloak-0\" (UID: \"fafb0b03-f904-4b6b-860f-350804783feb\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:42.045484 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.045451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chzm8\" (UniqueName: \"kubernetes.io/projected/fafb0b03-f904-4b6b-860f-350804783feb-kube-api-access-chzm8\") pod \"maas-keycloak-0\" (UID: \"fafb0b03-f904-4b6b-860f-350804783feb\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:42.045675 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.045495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/fafb0b03-f904-4b6b-860f-350804783feb-test-realms\") pod \"maas-keycloak-0\" (UID: \"fafb0b03-f904-4b6b-860f-350804783feb\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:42.046148 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.046126 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/fafb0b03-f904-4b6b-860f-350804783feb-test-realms\") pod \"maas-keycloak-0\" (UID: \"fafb0b03-f904-4b6b-860f-350804783feb\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:42.055954 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.055900 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzm8\" (UniqueName: \"kubernetes.io/projected/fafb0b03-f904-4b6b-860f-350804783feb-kube-api-access-chzm8\") pod \"maas-keycloak-0\" (UID: \"fafb0b03-f904-4b6b-860f-350804783feb\") " pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:42.165526 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.165472 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:42.288169 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.288145 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 21 04:49:42.291035 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:49:42.291009 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfafb0b03_f904_4b6b_860f_350804783feb.slice/crio-82f095834303d1c27e45cdfc2b8ddb75c9d07e0bec199653664e817d5bd220ad WatchSource:0}: Error finding container 82f095834303d1c27e45cdfc2b8ddb75c9d07e0bec199653664e817d5bd220ad: Status 404 returned error can't find the container with id 82f095834303d1c27e45cdfc2b8ddb75c9d07e0bec199653664e817d5bd220ad Apr 21 04:49:42.806623 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.806591 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"fafb0b03-f904-4b6b-860f-350804783feb","Type":"ContainerStarted","Data":"985b54f1ba55da97c0c7aae4060ec84f0f24d8a33b77338b611e55aa22bae010"} Apr 21 04:49:42.806623 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.806624 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"fafb0b03-f904-4b6b-860f-350804783feb","Type":"ContainerStarted","Data":"82f095834303d1c27e45cdfc2b8ddb75c9d07e0bec199653664e817d5bd220ad"} Apr 21 04:49:42.827187 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:42.827137 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.459661822 podStartE2EDuration="1.827122092s" podCreationTimestamp="2026-04-21 04:49:41 +0000 UTC" firstStartedPulling="2026-04-21 04:49:42.292702542 +0000 UTC m=+637.414260328" lastFinishedPulling="2026-04-21 04:49:42.66016281 +0000 UTC m=+637.781720598" observedRunningTime="2026-04-21 04:49:42.825890723 +0000 UTC m=+637.947448569" 
watchObservedRunningTime="2026-04-21 04:49:42.827122092 +0000 UTC m=+637.948679899" Apr 21 04:49:43.166537 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:43.166474 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:43.168088 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:43.168037 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:43.485982 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:43.485890 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d228d8-53f1-48d3-a7cd-39205a1ff870" path="/var/lib/kubelet/pods/b0d228d8-53f1-48d3-a7cd-39205a1ff870/volumes" Apr 21 04:49:44.166960 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:44.166900 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:45.166084 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:45.166019 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:46.166690 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:46.166636 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get 
\"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:47.166240 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:47.166188 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:48.166362 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:48.166300 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:49.166628 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:49.166499 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:50.166720 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:50.166670 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:51.166759 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:51.166703 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" 
Apr 21 04:49:52.166476 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:52.166430 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:52.166874 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:52.166839 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:53.166403 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.166346 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:53.697846 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.697810 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-b574d96b7-cjfq6"] Apr 21 04:49:53.706244 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.706214 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:53.708477 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.708448 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-b574d96b7-cjfq6"] Apr 21 04:49:53.710360 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.710336 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-2zzbn\"" Apr 21 04:49:53.710607 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.710336 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 04:49:53.710704 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.710390 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 04:49:53.874368 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.874326 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1208b807-53a8-4942-804a-0dedc391885d-maas-api-tls\") pod \"maas-api-b574d96b7-cjfq6\" (UID: \"1208b807-53a8-4942-804a-0dedc391885d\") " pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:53.874368 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.874382 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jz5\" (UniqueName: \"kubernetes.io/projected/1208b807-53a8-4942-804a-0dedc391885d-kube-api-access-j7jz5\") pod \"maas-api-b574d96b7-cjfq6\" (UID: \"1208b807-53a8-4942-804a-0dedc391885d\") " pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:53.976022 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.975905 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1208b807-53a8-4942-804a-0dedc391885d-maas-api-tls\") pod 
\"maas-api-b574d96b7-cjfq6\" (UID: \"1208b807-53a8-4942-804a-0dedc391885d\") " pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:53.976022 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.975960 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jz5\" (UniqueName: \"kubernetes.io/projected/1208b807-53a8-4942-804a-0dedc391885d-kube-api-access-j7jz5\") pod \"maas-api-b574d96b7-cjfq6\" (UID: \"1208b807-53a8-4942-804a-0dedc391885d\") " pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:53.979352 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.979317 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1208b807-53a8-4942-804a-0dedc391885d-maas-api-tls\") pod \"maas-api-b574d96b7-cjfq6\" (UID: \"1208b807-53a8-4942-804a-0dedc391885d\") " pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:53.985150 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:53.985121 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jz5\" (UniqueName: \"kubernetes.io/projected/1208b807-53a8-4942-804a-0dedc391885d-kube-api-access-j7jz5\") pod \"maas-api-b574d96b7-cjfq6\" (UID: \"1208b807-53a8-4942-804a-0dedc391885d\") " pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:54.022827 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:54.022787 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:54.166475 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:54.166428 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:54.187544 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:54.187517 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-b574d96b7-cjfq6"] Apr 21 04:49:54.189590 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:49:54.189549 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1208b807_53a8_4942_804a_0dedc391885d.slice/crio-fa5c67dbaff9b67573ba31a905873807ea56fa52e29c217c111e1fe2258dcea0 WatchSource:0}: Error finding container fa5c67dbaff9b67573ba31a905873807ea56fa52e29c217c111e1fe2258dcea0: Status 404 returned error can't find the container with id fa5c67dbaff9b67573ba31a905873807ea56fa52e29c217c111e1fe2258dcea0 Apr 21 04:49:54.191265 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:54.191241 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:49:54.863456 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:54.863414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-b574d96b7-cjfq6" event={"ID":"1208b807-53a8-4942-804a-0dedc391885d","Type":"ContainerStarted","Data":"fa5c67dbaff9b67573ba31a905873807ea56fa52e29c217c111e1fe2258dcea0"} Apr 21 04:49:55.166031 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:55.165974 2568 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="Get 
\"http://10.134.0.35:9000/health/started\": dial tcp 10.134.0.35:9000: connect: connection refused" Apr 21 04:49:55.869235 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:55.869209 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-b574d96b7-cjfq6" event={"ID":"1208b807-53a8-4942-804a-0dedc391885d","Type":"ContainerStarted","Data":"1ba819328c122b61af3896ad6c1a93118ee2c278fc775b549c890c3201c21798"} Apr 21 04:49:55.869717 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:55.869364 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 04:49:55.889275 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:55.889221 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-b574d96b7-cjfq6" podStartSLOduration=1.52913481 podStartE2EDuration="2.889206093s" podCreationTimestamp="2026-04-21 04:49:53 +0000 UTC" firstStartedPulling="2026-04-21 04:49:54.191417194 +0000 UTC m=+649.312974980" lastFinishedPulling="2026-04-21 04:49:55.551488469 +0000 UTC m=+650.673046263" observedRunningTime="2026-04-21 04:49:55.888217218 +0000 UTC m=+651.009775037" watchObservedRunningTime="2026-04-21 04:49:55.889206093 +0000 UTC m=+651.010763927" Apr 21 04:49:56.267024 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:56.265693 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 21 04:49:56.282875 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:49:56.282831 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="fafb0b03-f904-4b6b-860f-350804783feb" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 04:50:01.881544 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:01.881501 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-b574d96b7-cjfq6" Apr 21 
04:50:06.272300 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:06.272261 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 21 04:50:16.988546 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:16.988514 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79b69b4cbb-dq6bj"] Apr 21 04:50:16.993545 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:16.993523 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:16.996613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:16.996593 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 21 04:50:17.001914 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.001888 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79b69b4cbb-dq6bj"] Apr 21 04:50:17.102184 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.102150 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6ab56247-b48a-436b-a7f8-b0b05baa54f5-tls-cert\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.102387 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.102206 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/6ab56247-b48a-436b-a7f8-b0b05baa54f5-oidc-ca\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.102387 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.102243 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q85xp\" (UniqueName: \"kubernetes.io/projected/6ab56247-b48a-436b-a7f8-b0b05baa54f5-kube-api-access-q85xp\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.203209 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.203162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6ab56247-b48a-436b-a7f8-b0b05baa54f5-tls-cert\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.203390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.203232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/6ab56247-b48a-436b-a7f8-b0b05baa54f5-oidc-ca\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.203390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.203269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q85xp\" (UniqueName: \"kubernetes.io/projected/6ab56247-b48a-436b-a7f8-b0b05baa54f5-kube-api-access-q85xp\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.204123 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.204097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/6ab56247-b48a-436b-a7f8-b0b05baa54f5-oidc-ca\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.205766 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.205745 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6ab56247-b48a-436b-a7f8-b0b05baa54f5-tls-cert\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.212464 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.212440 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q85xp\" (UniqueName: \"kubernetes.io/projected/6ab56247-b48a-436b-a7f8-b0b05baa54f5-kube-api-access-q85xp\") pod \"authorino-79b69b4cbb-dq6bj\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.304032 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.303946 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:50:17.429517 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.429487 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79b69b4cbb-dq6bj"] Apr 21 04:50:17.432054 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:50:17.432010 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab56247_b48a_436b_a7f8_b0b05baa54f5.slice/crio-df5ab6cf12432e8829aac96f5887d59644cefd8b515988d05db950ee8ceb8034 WatchSource:0}: Error finding container df5ab6cf12432e8829aac96f5887d59644cefd8b515988d05db950ee8ceb8034: Status 404 returned error can't find the container with id df5ab6cf12432e8829aac96f5887d59644cefd8b515988d05db950ee8ceb8034 Apr 21 04:50:17.952354 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.952315 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" event={"ID":"6ab56247-b48a-436b-a7f8-b0b05baa54f5","Type":"ContainerStarted","Data":"66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099"} Apr 21 04:50:17.952354 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:50:17.952351 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" event={"ID":"6ab56247-b48a-436b-a7f8-b0b05baa54f5","Type":"ContainerStarted","Data":"df5ab6cf12432e8829aac96f5887d59644cefd8b515988d05db950ee8ceb8034"} Apr 21 04:50:17.972220 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.972167 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" podStartSLOduration=1.595298588 podStartE2EDuration="1.972152396s" podCreationTimestamp="2026-04-21 04:50:16 +0000 UTC" firstStartedPulling="2026-04-21 04:50:17.433361069 +0000 UTC m=+672.554918854" lastFinishedPulling="2026-04-21 04:50:17.81021487 +0000 UTC m=+672.931772662" observedRunningTime="2026-04-21 04:50:17.969535613 +0000 UTC m=+673.091093420" watchObservedRunningTime="2026-04-21 04:50:17.972152396 +0000 UTC m=+673.093710204" Apr 21 04:50:17.997364 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.997331 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-d5fc7c786-mbdf7"] Apr 21 04:50:17.997825 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:17.997528 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" podUID="404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" containerName="authorino" containerID="cri-o://e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8" gracePeriod=30 Apr 21 04:50:18.268554 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.268531 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" Apr 21 04:50:18.415830 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.415792 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-tls-cert\") pod \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " Apr 21 04:50:18.416011 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.415894 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbldt\" (UniqueName: \"kubernetes.io/projected/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-kube-api-access-cbldt\") pod \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\" (UID: \"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b\") " Apr 21 04:50:18.417929 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.417899 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-kube-api-access-cbldt" (OuterVolumeSpecName: "kube-api-access-cbldt") pod "404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" (UID: "404b9a39-e6cc-4ca7-8fcb-bcb93c01851b"). InnerVolumeSpecName "kube-api-access-cbldt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:50:18.428526 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.428492 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" (UID: "404b9a39-e6cc-4ca7-8fcb-bcb93c01851b"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:50:18.517357 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.517323 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cbldt\" (UniqueName: \"kubernetes.io/projected/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-kube-api-access-cbldt\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:50:18.517357 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.517352 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b-tls-cert\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:50:18.956791 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.956758 2568 generic.go:358] "Generic (PLEG): container finished" podID="404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" containerID="e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8" exitCode=0 Apr 21 04:50:18.956989 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.956806 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" Apr 21 04:50:18.956989 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.956839 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" event={"ID":"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b","Type":"ContainerDied","Data":"e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8"} Apr 21 04:50:18.956989 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.956878 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d5fc7c786-mbdf7" event={"ID":"404b9a39-e6cc-4ca7-8fcb-bcb93c01851b","Type":"ContainerDied","Data":"fb5f84e14cddd537bcdd8af764bbcf04dbe2d6a9b00449d4ee50fe5f7be1b89c"} Apr 21 04:50:18.956989 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.956893 2568 scope.go:117] "RemoveContainer" containerID="e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8" Apr 21 04:50:18.965760 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.965742 2568 scope.go:117] "RemoveContainer" containerID="e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8" Apr 21 04:50:18.965995 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:50:18.965969 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8\": container with ID starting with e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8 not found: ID does not exist" containerID="e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8" Apr 21 04:50:18.966063 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.966004 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8"} err="failed to get container status \"e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8\": rpc error: code = 
NotFound desc = could not find container \"e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8\": container with ID starting with e7512c5b5d293de136a68aecbdf7e79ada91619739f147bd1d5090a0cf4e44e8 not found: ID does not exist" Apr 21 04:50:18.978290 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.978270 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-d5fc7c786-mbdf7"] Apr 21 04:50:18.985951 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:18.985927 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-d5fc7c786-mbdf7"] Apr 21 04:50:19.485865 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:19.485826 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" path="/var/lib/kubelet/pods/404b9a39-e6cc-4ca7-8fcb-bcb93c01851b/volumes" Apr 21 04:50:26.520453 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.520422 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j"] Apr 21 04:50:26.520893 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.520832 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" containerName="authorino" Apr 21 04:50:26.520893 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.520844 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" containerName="authorino" Apr 21 04:50:26.520963 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.520906 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="404b9a39-e6cc-4ca7-8fcb-bcb93c01851b" containerName="authorino" Apr 21 04:50:26.532777 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.532747 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.533576 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.533548 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j"] Apr 21 04:50:26.535546 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.535508 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 04:50:26.535667 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.535548 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 04:50:26.536959 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.536877 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 21 04:50:26.536959 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.536934 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-pxf66\"" Apr 21 04:50:26.694824 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.694788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cee87130-14f5-441b-a69a-e8c4d1c2b24d-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.695003 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.694833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" 
Apr 21 04:50:26.695003 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.694869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.695003 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.694895 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.695003 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.694935 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.695003 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.694956 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2fjx\" (UniqueName: \"kubernetes.io/projected/cee87130-14f5-441b-a69a-e8c4d1c2b24d-kube-api-access-n2fjx\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.795535 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.795445 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cee87130-14f5-441b-a69a-e8c4d1c2b24d-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.795535 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.795491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.795535 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.795521 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.795808 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.795543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.795808 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.795579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: 
\"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.795808 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.795607 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2fjx\" (UniqueName: \"kubernetes.io/projected/cee87130-14f5-441b-a69a-e8c4d1c2b24d-kube-api-access-n2fjx\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.796009 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.795985 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.796102 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.796051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.796164 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.796104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.797832 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.797815 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cee87130-14f5-441b-a69a-e8c4d1c2b24d-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.797984 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.797966 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cee87130-14f5-441b-a69a-e8c4d1c2b24d-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.826266 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.826241 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2fjx\" (UniqueName: \"kubernetes.io/projected/cee87130-14f5-441b-a69a-e8c4d1c2b24d-kube-api-access-n2fjx\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j\" (UID: \"cee87130-14f5-441b-a69a-e8c4d1c2b24d\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.845233 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.845207 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:26.975494 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.975469 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j"] Apr 21 04:50:26.977378 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:50:26.977352 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee87130_14f5_441b_a69a_e8c4d1c2b24d.slice/crio-5f8cf049304256b4332097f03f67c4d34f05af6abca35a05790121636325d36e WatchSource:0}: Error finding container 5f8cf049304256b4332097f03f67c4d34f05af6abca35a05790121636325d36e: Status 404 returned error can't find the container with id 5f8cf049304256b4332097f03f67c4d34f05af6abca35a05790121636325d36e Apr 21 04:50:26.990142 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:26.990106 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" event={"ID":"cee87130-14f5-441b-a69a-e8c4d1c2b24d","Type":"ContainerStarted","Data":"5f8cf049304256b4332097f03f67c4d34f05af6abca35a05790121636325d36e"} Apr 21 04:50:34.023200 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:34.023164 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" event={"ID":"cee87130-14f5-441b-a69a-e8c4d1c2b24d","Type":"ContainerStarted","Data":"27c9c760c563e9f89caa40388ca45289481e099ea5587480d8e11650d585d23f"} Apr 21 04:50:39.042349 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:39.042259 2568 generic.go:358] "Generic (PLEG): container finished" podID="cee87130-14f5-441b-a69a-e8c4d1c2b24d" containerID="27c9c760c563e9f89caa40388ca45289481e099ea5587480d8e11650d585d23f" exitCode=0 Apr 21 04:50:39.042349 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:39.042331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" event={"ID":"cee87130-14f5-441b-a69a-e8c4d1c2b24d","Type":"ContainerDied","Data":"27c9c760c563e9f89caa40388ca45289481e099ea5587480d8e11650d585d23f"} Apr 21 04:50:41.051620 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:41.051584 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" event={"ID":"cee87130-14f5-441b-a69a-e8c4d1c2b24d","Type":"ContainerStarted","Data":"03780040f22b0d3a0a923d7a4316e465639cab6d436b34a159bf5ab1cca23e59"} Apr 21 04:50:41.052022 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:41.051809 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:41.072652 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:41.072600 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" podStartSLOduration=1.9757722 podStartE2EDuration="15.072584177s" podCreationTimestamp="2026-04-21 04:50:26 +0000 UTC" firstStartedPulling="2026-04-21 04:50:26.979198298 +0000 UTC m=+682.100756083" lastFinishedPulling="2026-04-21 04:50:40.076010275 +0000 UTC m=+695.197568060" observedRunningTime="2026-04-21 04:50:41.07108049 +0000 UTC m=+696.192638287" watchObservedRunningTime="2026-04-21 04:50:41.072584177 +0000 UTC m=+696.194141983" Apr 21 04:50:52.068175 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:52.068143 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j" Apr 21 04:50:55.218427 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.218338 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs"] Apr 21 04:50:55.225678 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.225656 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.228568 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.228548 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 21 04:50:55.236130 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.236107 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs"] Apr 21 04:50:55.361132 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.361094 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhdbn\" (UniqueName: \"kubernetes.io/projected/b742a6de-7ac8-4680-90d7-054e1e2e3e42-kube-api-access-zhdbn\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.361338 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.361213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.361338 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.361266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.361338 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.361308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.361474 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.361362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.361474 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.361386 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b742a6de-7ac8-4680-90d7-054e1e2e3e42-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462409 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462613 ip-10-0-141-241 kubenswrapper[2568]: I0421 
04:50:55.462419 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462515 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b742a6de-7ac8-4680-90d7-054e1e2e3e42-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462613 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhdbn\" (UniqueName: 
\"kubernetes.io/projected/b742a6de-7ac8-4680-90d7-054e1e2e3e42-kube-api-access-zhdbn\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462871 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462871 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462826 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.462871 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.462852 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.464932 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.464904 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b742a6de-7ac8-4680-90d7-054e1e2e3e42-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: 
\"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.465328 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.465305 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b742a6de-7ac8-4680-90d7-054e1e2e3e42-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.471353 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.471300 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhdbn\" (UniqueName: \"kubernetes.io/projected/b742a6de-7ac8-4680-90d7-054e1e2e3e42-kube-api-access-zhdbn\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs\" (UID: \"b742a6de-7ac8-4680-90d7-054e1e2e3e42\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.534470 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.534437 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:50:55.662144 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:55.662120 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs"] Apr 21 04:50:55.664600 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:50:55.664555 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb742a6de_7ac8_4680_90d7_054e1e2e3e42.slice/crio-2abf0a6120125f00ce7f0babdba710c24aef5d61ba0b7b932be41108a67576ee WatchSource:0}: Error finding container 2abf0a6120125f00ce7f0babdba710c24aef5d61ba0b7b932be41108a67576ee: Status 404 returned error can't find the container with id 2abf0a6120125f00ce7f0babdba710c24aef5d61ba0b7b932be41108a67576ee Apr 21 04:50:56.110372 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:56.110338 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" event={"ID":"b742a6de-7ac8-4680-90d7-054e1e2e3e42","Type":"ContainerStarted","Data":"d0372a80aa3c2cfefef34cc28a87b919929959bc2201d5c93095e54f28c8b8a4"} Apr 21 04:50:56.110372 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:50:56.110374 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" event={"ID":"b742a6de-7ac8-4680-90d7-054e1e2e3e42","Type":"ContainerStarted","Data":"2abf0a6120125f00ce7f0babdba710c24aef5d61ba0b7b932be41108a67576ee"} Apr 21 04:51:05.150083 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:05.150043 2568 generic.go:358] "Generic (PLEG): container finished" podID="b742a6de-7ac8-4680-90d7-054e1e2e3e42" containerID="d0372a80aa3c2cfefef34cc28a87b919929959bc2201d5c93095e54f28c8b8a4" exitCode=0 Apr 21 04:51:05.150567 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:05.150123 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" event={"ID":"b742a6de-7ac8-4680-90d7-054e1e2e3e42","Type":"ContainerDied","Data":"d0372a80aa3c2cfefef34cc28a87b919929959bc2201d5c93095e54f28c8b8a4"} Apr 21 04:51:06.155508 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:06.155472 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" event={"ID":"b742a6de-7ac8-4680-90d7-054e1e2e3e42","Type":"ContainerStarted","Data":"6e9bb23b8470d485c6aa50bbbbde23e6dadea290203945e5afdf800192a7fc38"} Apr 21 04:51:06.155912 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:06.155669 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:51:06.174007 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:06.173952 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" podStartSLOduration=10.992354013 podStartE2EDuration="11.173938127s" podCreationTimestamp="2026-04-21 04:50:55 +0000 UTC" firstStartedPulling="2026-04-21 04:51:05.15088321 +0000 UTC m=+720.272440994" lastFinishedPulling="2026-04-21 04:51:05.33246732 +0000 UTC m=+720.454025108" observedRunningTime="2026-04-21 04:51:06.171968566 +0000 UTC m=+721.293526372" watchObservedRunningTime="2026-04-21 04:51:06.173938127 +0000 UTC m=+721.295495933" Apr 21 04:51:17.172833 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:17.172799 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs" Apr 21 04:51:48.873108 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:48.873059 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-896c5bf95-7dhq9"] Apr 21 04:51:48.876788 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:48.876766 2568 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:48.883516 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:48.883490 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-896c5bf95-7dhq9"] Apr 21 04:51:49.060734 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.060704 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/447a6cda-c20b-4a28-a650-5250e0889a94-tls-cert\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.060734 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.060739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw77r\" (UniqueName: \"kubernetes.io/projected/447a6cda-c20b-4a28-a650-5250e0889a94-kube-api-access-fw77r\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.060952 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.060764 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/447a6cda-c20b-4a28-a650-5250e0889a94-oidc-ca\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.162148 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.162034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/447a6cda-c20b-4a28-a650-5250e0889a94-tls-cert\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.162148 ip-10-0-141-241 kubenswrapper[2568]: 
I0421 04:51:49.162126 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw77r\" (UniqueName: \"kubernetes.io/projected/447a6cda-c20b-4a28-a650-5250e0889a94-kube-api-access-fw77r\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.162354 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.162153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/447a6cda-c20b-4a28-a650-5250e0889a94-oidc-ca\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.162751 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.162732 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/447a6cda-c20b-4a28-a650-5250e0889a94-oidc-ca\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.164743 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.164722 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/447a6cda-c20b-4a28-a650-5250e0889a94-tls-cert\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.169890 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.169859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw77r\" (UniqueName: \"kubernetes.io/projected/447a6cda-c20b-4a28-a650-5250e0889a94-kube-api-access-fw77r\") pod \"authorino-896c5bf95-7dhq9\" (UID: \"447a6cda-c20b-4a28-a650-5250e0889a94\") " pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.187679 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:51:49.187657 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-896c5bf95-7dhq9" Apr 21 04:51:49.311224 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.311196 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-896c5bf95-7dhq9"] Apr 21 04:51:49.313880 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:51:49.313851 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod447a6cda_c20b_4a28_a650_5250e0889a94.slice/crio-e54f350981ddc8a7aebb255328eae59ff7ba6f659e233a7f89d1f0eb8b6fd18f WatchSource:0}: Error finding container e54f350981ddc8a7aebb255328eae59ff7ba6f659e233a7f89d1f0eb8b6fd18f: Status 404 returned error can't find the container with id e54f350981ddc8a7aebb255328eae59ff7ba6f659e233a7f89d1f0eb8b6fd18f Apr 21 04:51:49.320640 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:49.320613 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-896c5bf95-7dhq9" event={"ID":"447a6cda-c20b-4a28-a650-5250e0889a94","Type":"ContainerStarted","Data":"e54f350981ddc8a7aebb255328eae59ff7ba6f659e233a7f89d1f0eb8b6fd18f"} Apr 21 04:51:50.325466 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.325429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-896c5bf95-7dhq9" event={"ID":"447a6cda-c20b-4a28-a650-5250e0889a94","Type":"ContainerStarted","Data":"4da44612172859b2a674b3d82929a6b1896d09b5045dafbd1b8bd8733aba4149"} Apr 21 04:51:50.342146 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.342093 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-896c5bf95-7dhq9" podStartSLOduration=2.023083234 podStartE2EDuration="2.342079568s" podCreationTimestamp="2026-04-21 04:51:48 +0000 UTC" firstStartedPulling="2026-04-21 04:51:49.315249491 +0000 UTC m=+764.436807276" 
lastFinishedPulling="2026-04-21 04:51:49.634245822 +0000 UTC m=+764.755803610" observedRunningTime="2026-04-21 04:51:50.33992429 +0000 UTC m=+765.461482097" watchObservedRunningTime="2026-04-21 04:51:50.342079568 +0000 UTC m=+765.463637365" Apr 21 04:51:50.369480 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.369432 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79b69b4cbb-dq6bj"] Apr 21 04:51:50.370289 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.370243 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" podUID="6ab56247-b48a-436b-a7f8-b0b05baa54f5" containerName="authorino" containerID="cri-o://66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099" gracePeriod=30 Apr 21 04:51:50.630304 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.630275 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:51:50.778798 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.778757 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/6ab56247-b48a-436b-a7f8-b0b05baa54f5-oidc-ca\") pod \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " Apr 21 04:51:50.778994 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.778847 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q85xp\" (UniqueName: \"kubernetes.io/projected/6ab56247-b48a-436b-a7f8-b0b05baa54f5-kube-api-access-q85xp\") pod \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " Apr 21 04:51:50.778994 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.778879 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/6ab56247-b48a-436b-a7f8-b0b05baa54f5-tls-cert\") pod \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\" (UID: \"6ab56247-b48a-436b-a7f8-b0b05baa54f5\") " Apr 21 04:51:50.780953 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.780922 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ab56247-b48a-436b-a7f8-b0b05baa54f5-kube-api-access-q85xp" (OuterVolumeSpecName: "kube-api-access-q85xp") pod "6ab56247-b48a-436b-a7f8-b0b05baa54f5" (UID: "6ab56247-b48a-436b-a7f8-b0b05baa54f5"). InnerVolumeSpecName "kube-api-access-q85xp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:51:50.785207 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.785179 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ab56247-b48a-436b-a7f8-b0b05baa54f5-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "6ab56247-b48a-436b-a7f8-b0b05baa54f5" (UID: "6ab56247-b48a-436b-a7f8-b0b05baa54f5"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:51:50.789678 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.789656 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ab56247-b48a-436b-a7f8-b0b05baa54f5-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "6ab56247-b48a-436b-a7f8-b0b05baa54f5" (UID: "6ab56247-b48a-436b-a7f8-b0b05baa54f5"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:51:50.880690 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.880660 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q85xp\" (UniqueName: \"kubernetes.io/projected/6ab56247-b48a-436b-a7f8-b0b05baa54f5-kube-api-access-q85xp\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:51:50.880690 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.880689 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/6ab56247-b48a-436b-a7f8-b0b05baa54f5-tls-cert\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:51:50.880889 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:50.880699 2568 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/6ab56247-b48a-436b-a7f8-b0b05baa54f5-oidc-ca\") on node \"ip-10-0-141-241.ec2.internal\" DevicePath \"\"" Apr 21 04:51:51.329984 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.329948 2568 generic.go:358] "Generic (PLEG): container finished" podID="6ab56247-b48a-436b-a7f8-b0b05baa54f5" containerID="66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099" exitCode=0 Apr 21 04:51:51.330420 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.330010 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" Apr 21 04:51:51.330420 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.330034 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" event={"ID":"6ab56247-b48a-436b-a7f8-b0b05baa54f5","Type":"ContainerDied","Data":"66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099"} Apr 21 04:51:51.330420 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.330101 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79b69b4cbb-dq6bj" event={"ID":"6ab56247-b48a-436b-a7f8-b0b05baa54f5","Type":"ContainerDied","Data":"df5ab6cf12432e8829aac96f5887d59644cefd8b515988d05db950ee8ceb8034"} Apr 21 04:51:51.330420 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.330124 2568 scope.go:117] "RemoveContainer" containerID="66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099" Apr 21 04:51:51.340397 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.340377 2568 scope.go:117] "RemoveContainer" containerID="66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099" Apr 21 04:51:51.340674 ip-10-0-141-241 kubenswrapper[2568]: E0421 04:51:51.340656 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099\": container with ID starting with 66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099 not found: ID does not exist" containerID="66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099" Apr 21 04:51:51.340724 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.340686 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099"} err="failed to get container status \"66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099\": rpc error: code = 
NotFound desc = could not find container \"66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099\": container with ID starting with 66be5e4abad04eeb98760b9727cca02297a61613320c740464b6c00bff2b5099 not found: ID does not exist" Apr 21 04:51:51.367222 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.367190 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79b69b4cbb-dq6bj"] Apr 21 04:51:51.375297 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.375263 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79b69b4cbb-dq6bj"] Apr 21 04:51:51.486187 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:51:51.486097 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ab56247-b48a-436b-a7f8-b0b05baa54f5" path="/var/lib/kubelet/pods/6ab56247-b48a-436b-a7f8-b0b05baa54f5/volumes" Apr 21 04:52:29.266218 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:29.266126 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-896c5bf95-7dhq9_447a6cda-c20b-4a28-a650-5250e0889a94/authorino/0.log" Apr 21 04:52:33.288781 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:33.288745 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-b574d96b7-cjfq6_1208b807-53a8-4942-804a-0dedc391885d/maas-api/0.log" Apr 21 04:52:33.751734 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:33.751695 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-55ddb68486-pbs7j_18136690-4103-4545-8843-72079fd2605c/manager/0.log" Apr 21 04:52:35.242205 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:35.242178 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-896c5bf95-7dhq9_447a6cda-c20b-4a28-a650-5250e0889a94/authorino/0.log" Apr 21 04:52:35.465476 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:35.465442 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9wb8h_5e69f849-cd12-40d2-8bea-8818945861ad/manager/0.log" Apr 21 04:52:35.804198 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:35.804165 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-xw2hr_7d1e892c-2a50-4136-bda4-431ffab2513b/manager/0.log" Apr 21 04:52:36.588044 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:36.588018 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-788fdfdbbd-4svlw_abcb51c1-3dee-45ae-8fb8-36db34a5e745/kube-auth-proxy/0.log" Apr 21 04:52:36.909006 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:36.908973 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-686d5855f4-tcz66_e7fc8dd3-0312-4bab-a12c-6a11df14266e/router/0.log" Apr 21 04:52:37.734482 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:37.734454 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j_cee87130-14f5-441b-a69a-e8c4d1c2b24d/storage-initializer/0.log" Apr 21 04:52:37.743132 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:37.743105 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-5dx5j_cee87130-14f5-441b-a69a-e8c4d1c2b24d/main/0.log" Apr 21 04:52:37.850267 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:37.850239 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs_b742a6de-7ac8-4680-90d7-054e1e2e3e42/storage-initializer/0.log" Apr 21 04:52:37.861374 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:37.861348 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-2gxzs_b742a6de-7ac8-4680-90d7-054e1e2e3e42/main/0.log" Apr 21 04:52:44.437606 ip-10-0-141-241 
kubenswrapper[2568]: I0421 04:52:44.437568 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4s8qh_973fae08-b8f2-43d5-9852-9711b218e560/global-pull-secret-syncer/0.log" Apr 21 04:52:44.534760 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:44.534725 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8djwj_12c88bd4-6fd7-4968-a57a-c248d0a14470/konnectivity-agent/0.log" Apr 21 04:52:44.675761 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:44.675729 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-241.ec2.internal_b815c8ee131098fba59a15d8b002ae1a/haproxy/0.log" Apr 21 04:52:48.953903 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:48.953870 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-896c5bf95-7dhq9_447a6cda-c20b-4a28-a650-5250e0889a94/authorino/0.log" Apr 21 04:52:49.008598 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:49.008515 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9wb8h_5e69f849-cd12-40d2-8bea-8818945861ad/manager/0.log" Apr 21 04:52:49.131718 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:49.131690 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-xw2hr_7d1e892c-2a50-4136-bda4-431ffab2513b/manager/0.log" Apr 21 04:52:50.655390 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.655361 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_872de7ac-6622-48c0-b75b-c05be54702c3/alertmanager/0.log" Apr 21 04:52:50.683621 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.683597 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_872de7ac-6622-48c0-b75b-c05be54702c3/config-reloader/0.log" Apr 21 
04:52:50.708555 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.708530 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_872de7ac-6622-48c0-b75b-c05be54702c3/kube-rbac-proxy-web/0.log" Apr 21 04:52:50.742550 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.742518 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_872de7ac-6622-48c0-b75b-c05be54702c3/kube-rbac-proxy/0.log" Apr 21 04:52:50.774712 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.774685 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_872de7ac-6622-48c0-b75b-c05be54702c3/kube-rbac-proxy-metric/0.log" Apr 21 04:52:50.811198 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.811175 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_872de7ac-6622-48c0-b75b-c05be54702c3/prom-label-proxy/0.log" Apr 21 04:52:50.839396 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.839372 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_872de7ac-6622-48c0-b75b-c05be54702c3/init-config-reloader/0.log" Apr 21 04:52:50.878838 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:50.878753 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-g25vj_8cf0095a-7001-4da1-893d-f6430e613fe9/cluster-monitoring-operator/0.log" Apr 21 04:52:51.133568 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.133480 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n5d8t_f8c9b1cf-c943-4008-bc8d-4783051875d2/node-exporter/0.log" Apr 21 04:52:51.153925 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.153901 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-n5d8t_f8c9b1cf-c943-4008-bc8d-4783051875d2/kube-rbac-proxy/0.log" Apr 21 04:52:51.175984 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.175957 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n5d8t_f8c9b1cf-c943-4008-bc8d-4783051875d2/init-textfile/0.log" Apr 21 04:52:51.280405 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.280378 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-rjx6k_0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75/kube-rbac-proxy-main/0.log" Apr 21 04:52:51.302792 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.302770 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-rjx6k_0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75/kube-rbac-proxy-self/0.log" Apr 21 04:52:51.326009 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.325977 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-rjx6k_0d8be9f5-8edd-4d4d-b9f0-3dffedcf1c75/openshift-state-metrics/0.log" Apr 21 04:52:51.364308 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.364280 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1d1329d2-f78b-491e-aea8-de3ec08202c7/prometheus/0.log" Apr 21 04:52:51.392255 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.392170 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1d1329d2-f78b-491e-aea8-de3ec08202c7/config-reloader/0.log" Apr 21 04:52:51.417227 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.417202 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1d1329d2-f78b-491e-aea8-de3ec08202c7/thanos-sidecar/0.log" Apr 21 04:52:51.441725 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.441697 2568 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1d1329d2-f78b-491e-aea8-de3ec08202c7/kube-rbac-proxy-web/0.log" Apr 21 04:52:51.465262 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.465235 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1d1329d2-f78b-491e-aea8-de3ec08202c7/kube-rbac-proxy/0.log" Apr 21 04:52:51.487471 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.487441 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1d1329d2-f78b-491e-aea8-de3ec08202c7/kube-rbac-proxy-thanos/0.log" Apr 21 04:52:51.509591 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.509560 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1d1329d2-f78b-491e-aea8-de3ec08202c7/init-config-reloader/0.log" Apr 21 04:52:51.585240 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.585213 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-94zhn_609ee659-7c06-49c7-a370-5afdac306be6/prometheus-operator-admission-webhook/0.log" Apr 21 04:52:51.705708 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.705622 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9c54d565-czdz9_524c70fa-e764-4096-9e62-1e894feb43cd/thanos-query/0.log" Apr 21 04:52:51.728329 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.728303 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9c54d565-czdz9_524c70fa-e764-4096-9e62-1e894feb43cd/kube-rbac-proxy-web/0.log" Apr 21 04:52:51.750085 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.750050 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9c54d565-czdz9_524c70fa-e764-4096-9e62-1e894feb43cd/kube-rbac-proxy/0.log" Apr 21 
04:52:51.777681 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.777659 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9c54d565-czdz9_524c70fa-e764-4096-9e62-1e894feb43cd/prom-label-proxy/0.log"
Apr 21 04:52:51.804559 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.804535 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9c54d565-czdz9_524c70fa-e764-4096-9e62-1e894feb43cd/kube-rbac-proxy-rules/0.log"
Apr 21 04:52:51.833228 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:51.833198 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9c54d565-czdz9_524c70fa-e764-4096-9e62-1e894feb43cd/kube-rbac-proxy-metrics/0.log"
Apr 21 04:52:52.877569 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.877537 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"]
Apr 21 04:52:52.877970 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.877923 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ab56247-b48a-436b-a7f8-b0b05baa54f5" containerName="authorino"
Apr 21 04:52:52.877970 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.877934 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab56247-b48a-436b-a7f8-b0b05baa54f5" containerName="authorino"
Apr 21 04:52:52.878060 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.878015 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ab56247-b48a-436b-a7f8-b0b05baa54f5" containerName="authorino"
Apr 21 04:52:52.881421 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.881404 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:52.884080 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.884044 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tffw4\"/\"kube-root-ca.crt\""
Apr 21 04:52:52.885030 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.885010 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tffw4\"/\"openshift-service-ca.crt\""
Apr 21 04:52:52.885124 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.885050 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tffw4\"/\"default-dockercfg-49xnb\""
Apr 21 04:52:52.890304 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.890284 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"]
Apr 21 04:52:52.931572 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.931539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-sys\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:52.931707 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.931575 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94wx6\" (UniqueName: \"kubernetes.io/projected/8137d574-de2c-4bba-b666-6a1b51711c5c-kube-api-access-94wx6\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:52.931707 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.931677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-podres\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:52.931800 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.931736 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-proc\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:52.931800 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:52.931771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-lib-modules\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.032770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.032737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-sys\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.032770 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.032775 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94wx6\" (UniqueName: \"kubernetes.io/projected/8137d574-de2c-4bba-b666-6a1b51711c5c-kube-api-access-94wx6\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.032988 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.032825 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-podres\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.032988 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.032854 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-proc\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.032988 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.032865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-sys\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.032988 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.032883 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-lib-modules\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.032988 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.032924 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-proc\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.033191 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.033002 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-lib-modules\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.033191 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.033001 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8137d574-de2c-4bba-b666-6a1b51711c5c-podres\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.041979 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.041951 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94wx6\" (UniqueName: \"kubernetes.io/projected/8137d574-de2c-4bba-b666-6a1b51711c5c-kube-api-access-94wx6\") pod \"perf-node-gather-daemonset-hk6x8\" (UID: \"8137d574-de2c-4bba-b666-6a1b51711c5c\") " pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.192736 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.192648 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.321931 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.321906 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"]
Apr 21 04:52:53.324121 ip-10-0-141-241 kubenswrapper[2568]: W0421 04:52:53.324093 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8137d574_de2c_4bba_b666_6a1b51711c5c.slice/crio-aef37a9dc302611b4d61f6edd814437d001031fffbcee5984a13e1dca009b0b9 WatchSource:0}: Error finding container aef37a9dc302611b4d61f6edd814437d001031fffbcee5984a13e1dca009b0b9: Status 404 returned error can't find the container with id aef37a9dc302611b4d61f6edd814437d001031fffbcee5984a13e1dca009b0b9
Apr 21 04:52:53.559287 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.559252 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/2.log"
Apr 21 04:52:53.559916 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.559755 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8" event={"ID":"8137d574-de2c-4bba-b666-6a1b51711c5c","Type":"ContainerStarted","Data":"39fa42f88468e45f9540eaab00378ec08f65a594270ef7098464f7f2cae05af4"}
Apr 21 04:52:53.559916 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.559796 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8" event={"ID":"8137d574-de2c-4bba-b666-6a1b51711c5c","Type":"ContainerStarted","Data":"aef37a9dc302611b4d61f6edd814437d001031fffbcee5984a13e1dca009b0b9"}
Apr 21 04:52:53.559916 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.559875 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:52:53.568001 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.567977 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wdf4b_8f53d634-201a-47a8-bb8f-a939d320e536/console-operator/3.log"
Apr 21 04:52:53.576428 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:53.576362 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8" podStartSLOduration=1.576347317 podStartE2EDuration="1.576347317s" podCreationTimestamp="2026-04-21 04:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:52:53.575340696 +0000 UTC m=+828.696898503" watchObservedRunningTime="2026-04-21 04:52:53.576347317 +0000 UTC m=+828.697905127"
Apr 21 04:52:55.424301 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:55.424266 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjtkj_70e12877-2720-42f7-b047-316b48c6b8fe/dns/0.log"
Apr 21 04:52:55.449220 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:55.449185 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjtkj_70e12877-2720-42f7-b047-316b48c6b8fe/kube-rbac-proxy/0.log"
Apr 21 04:52:55.577187 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:55.577152 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n7x9t_a74e0bd7-e17f-4529-9299-93c38644ab68/dns-node-resolver/0.log"
Apr 21 04:52:56.097927 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:56.097898 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hm47b_fed25630-f93d-40db-800e-f8042fc4f7ca/node-ca/0.log"
Apr 21 04:52:57.002054 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:57.002022 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-788fdfdbbd-4svlw_abcb51c1-3dee-45ae-8fb8-36db34a5e745/kube-auth-proxy/0.log"
Apr 21 04:52:57.102886 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:57.102859 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-686d5855f4-tcz66_e7fc8dd3-0312-4bab-a12c-6a11df14266e/router/0.log"
Apr 21 04:52:57.777765 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:57.777736 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-k8p5j_f8e3c014-c3ca-4d5c-ae42-2a22ad9530c8/serve-healthcheck-canary/0.log"
Apr 21 04:52:58.277559 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:58.277529 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-srd87_74465859-3c82-4f58-832b-74cc4fbe41ce/insights-operator/1.log"
Apr 21 04:52:58.278116 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:58.278087 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-srd87_74465859-3c82-4f58-832b-74cc4fbe41ce/insights-operator/0.log"
Apr 21 04:52:58.371717 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:58.371688 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l4ghz_58730f05-2dc5-4837-8ad1-3f20245c3215/kube-rbac-proxy/0.log"
Apr 21 04:52:58.393514 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:58.393482 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l4ghz_58730f05-2dc5-4837-8ad1-3f20245c3215/exporter/0.log"
Apr 21 04:52:58.417585 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:58.417558 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l4ghz_58730f05-2dc5-4837-8ad1-3f20245c3215/extractor/0.log"
Apr 21 04:52:59.578811 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:52:59.578782 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tffw4/perf-node-gather-daemonset-hk6x8"
Apr 21 04:53:00.489057 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:00.489017 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-b574d96b7-cjfq6_1208b807-53a8-4942-804a-0dedc391885d/maas-api/0.log"
Apr 21 04:53:00.628044 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:00.628009 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-55ddb68486-pbs7j_18136690-4103-4545-8843-72079fd2605c/manager/0.log"
Apr 21 04:53:02.135461 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:02.135427 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-bb95b586d-plmjx_868e0256-9c68-4772-8ae4-45e88a055901/manager/0.log"
Apr 21 04:53:06.527785 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:06.527747 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vlqww_02ca5737-060f-403c-8352-2dec9f92bd2e/migrator/0.log"
Apr 21 04:53:06.549141 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:06.549064 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vlqww_02ca5737-060f-403c-8352-2dec9f92bd2e/graceful-termination/0.log"
Apr 21 04:53:07.932934 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:07.932910 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2pbgh_18ab0325-5097-4d89-bf24-9c599b9efbdc/kube-multus/0.log"
Apr 21 04:53:08.306565 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.306487 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-km9z2_4d82872b-b0cd-4247-abb3-ce1e75dfd32b/kube-multus-additional-cni-plugins/0.log"
Apr 21 04:53:08.328786 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.328758 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-km9z2_4d82872b-b0cd-4247-abb3-ce1e75dfd32b/egress-router-binary-copy/0.log"
Apr 21 04:53:08.357492 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.357469 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-km9z2_4d82872b-b0cd-4247-abb3-ce1e75dfd32b/cni-plugins/0.log"
Apr 21 04:53:08.380413 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.380388 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-km9z2_4d82872b-b0cd-4247-abb3-ce1e75dfd32b/bond-cni-plugin/0.log"
Apr 21 04:53:08.404060 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.404034 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-km9z2_4d82872b-b0cd-4247-abb3-ce1e75dfd32b/routeoverride-cni/0.log"
Apr 21 04:53:08.433824 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.433800 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-km9z2_4d82872b-b0cd-4247-abb3-ce1e75dfd32b/whereabouts-cni-bincopy/0.log"
Apr 21 04:53:08.455871 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.455844 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-km9z2_4d82872b-b0cd-4247-abb3-ce1e75dfd32b/whereabouts-cni/0.log"
Apr 21 04:53:08.697010 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.696982 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z4rqh_8743d92c-6080-4066-ad83-55bb582a3f6c/network-metrics-daemon/0.log"
Apr 21 04:53:08.755272 ip-10-0-141-241 kubenswrapper[2568]: I0421 04:53:08.755246 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z4rqh_8743d92c-6080-4066-ad83-55bb582a3f6c/kube-rbac-proxy/0.log"