Apr 16 14:27:09.869023 ip-10-0-140-144 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:27:09.869035 ip-10-0-140-144 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:27:09.869043 ip-10-0-140-144 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:27:09.869340 ip-10-0-140-144 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:27:21.019367 ip-10-0-140-144 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:27:21.019383 ip-10-0-140-144 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot fbe04fdd364844f782996b389cf2a709 --
Apr 16 14:29:53.255334 ip-10-0-140-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:29:53.719411 ip-10-0-140-144 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:53.719411 ip-10-0-140-144 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:29:53.719411 ip-10-0-140-144 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:53.719411 ip-10-0-140-144 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:29:53.719411 ip-10-0-140-144 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:53.721153 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.721057    2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:29:53.723575 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723552    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:53.723575 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723576    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723580    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723583    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723587    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723590    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723593    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723595    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723598    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723600    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723603    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723606    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723609    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723612    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723615    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723617    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723620    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723623    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723626    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723629    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723632    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:53.723640 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723636    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723638    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723641    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723644    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723647    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723650    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723653    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723656    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723659    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723661    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723664    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723667    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723669    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723672    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723675    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723677    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723679    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723682    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723684    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723687    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:53.724166 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723690    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723692    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723695    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723698    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723700    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723702    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723707    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723711    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723713    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723716    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723719    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723722    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723724    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723727    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723731    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723733    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723736    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723739    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723741    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:53.724679 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723744    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723747    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723749    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723753    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723757    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723760    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723771    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723775    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723778    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723780    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723783    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723786    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723788    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723791    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723794    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723796    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723800    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723802    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723805    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:53.725155 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723807    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723810    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723813    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723816    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723818    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723821    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.723824    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724273    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724281    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724284    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724287    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724290    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724293    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724296    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724298    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724301    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724304    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724306    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724309    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724312    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724315    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:53.725607 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724318    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724320    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724323    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724326    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724329    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724331    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724334    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724337    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724339    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724342    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724345    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724348    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724350    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724353    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724355    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724358    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724361    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724364    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724366    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724369    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:53.726123 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724372    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724374    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724377    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724380    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724382    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724385    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724387    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724390    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724393    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724395    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724398    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724401    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724403    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724406    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724408    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724411    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724414    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724416    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724419    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:53.726654 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724421    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724424    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724427    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724430    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724432    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724435    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724438    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724440    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724443    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724445    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724448    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724450    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724454    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724457    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724459    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724462    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724467    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724470    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724473    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:53.727133 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724476    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724479    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724482    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724486    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724489    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724491    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724494    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724497    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724499    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724502    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724505    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724508    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724511    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.724514    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724598    2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724617    2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724624    2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724629    2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724634    2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724637    2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724642    2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:29:53.727611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724647    2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724650    2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724653    2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724656    2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724660    2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724664    2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724667    2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724670    2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724673    2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724676    2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724681    2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724684    2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724689    2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724692    2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724696    2579 flags.go:64] FLAG: --config-dir=""
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724699    2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724702    2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724706    2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724710    2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724713    2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724717    2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724720    2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724723    2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724726    2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724729    2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:29:53.728129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724732    2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724736    2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724739    2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724742    2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724745    2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724748    2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724751    2579 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724756 2579 flags.go:64] FLAG: --event-burst="100" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724759 2579 flags.go:64] FLAG: --event-qps="50" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724762 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724765 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724768 2579 flags.go:64] FLAG: --eviction-hard="" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724772 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724775 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724778 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724782 2579 flags.go:64] FLAG: --eviction-soft="" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724785 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724790 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724793 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724796 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724799 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:29:53.728741 ip-10-0-140-144 
kubenswrapper[2579]: I0416 14:29:53.724802 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724805 2579 flags.go:64] FLAG: --feature-gates="" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724809 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724812 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:29:53.728741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724815 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724820 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724823 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724826 2579 flags.go:64] FLAG: --help="false" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724829 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-140-144.ec2.internal" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724833 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724836 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724839 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724842 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724846 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724849 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724852 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724855 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724858 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724861 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724864 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724867 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724870 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724873 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724876 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724879 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724882 2579 flags.go:64] FLAG: --lock-file="" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724885 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724888 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:29:53.729357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724892 2579 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724898 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724901 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724904 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724907 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724909 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724913 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724916 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724918 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724925 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724929 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724933 2579 flags.go:64] FLAG: --max-pods="110" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724936 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724939 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724942 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 
14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724945 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724948 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724951 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724954 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724963 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724966 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724969 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724972 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:29:53.729943 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724975 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724980 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724983 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724987 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724993 2579 flags.go:64] FLAG: --port="10250" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.724997 2579 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725000 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02f6184fc2884c717" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725003 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725006 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725010 2579 flags.go:64] FLAG: --register-node="true" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725013 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725016 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725019 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725022 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725025 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725040 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725044 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725047 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725050 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725053 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725057 2579 
flags.go:64] FLAG: --runonce="false" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725060 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725063 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725066 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725069 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725072 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:29:53.730529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725075 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725079 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725082 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725085 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725088 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725091 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725093 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725097 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725100 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 
14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725103 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725110 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725113 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725116 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725120 2579 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725123 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725128 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725131 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725134 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725138 2579 flags.go:64] FLAG: --v="2" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725142 2579 flags.go:64] FLAG: --version="false" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725146 2579 flags.go:64] FLAG: --vmodule="" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725151 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725154 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725303 2579 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725308 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:29:53.731162 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725311 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725315 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725318 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725321 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725323 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725326 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725329 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725331 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725334 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725337 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725339 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:29:53.731799 ip-10-0-140-144 
kubenswrapper[2579]: W0416 14:29:53.725342 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725344 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725347 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725350 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725352 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725355 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725359 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725361 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725364 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:29:53.731799 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725367 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725370 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725374 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725377 2579 feature_gate.go:328] unrecognized 
feature gate: AzureMultiDisk Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725379 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725382 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725385 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725388 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725390 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725393 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725395 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725398 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725401 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725403 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725408 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725417 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725420 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725423 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725426 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725430 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:29:53.732338 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725434 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725437 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725440 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725442 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725445 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725447 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725450 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725453 2579 feature_gate.go:328] unrecognized feature 
gate: BuildCSIVolumes Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725455 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725460 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725462 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725465 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725468 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725470 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725475 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725477 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725480 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725483 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725485 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725488 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:29:53.732835 ip-10-0-140-144 kubenswrapper[2579]: W0416 
14:29:53.725490 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725493 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725496 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725498 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725501 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725504 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725506 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725509 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725517 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725520 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725523 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725525 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725528 2579 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725530 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725533 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725536 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725538 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725541 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725543 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:29:53.733343 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725546 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725549 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725553 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725555 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.725558 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.725564 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false 
ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.732885 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.732902 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732951 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732957 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732962 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732965 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732968 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732971 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732974 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:53.733805 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732977 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732980 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732983 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732985 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732988 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732991 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732993 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732996 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.732998 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733001 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733004 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733006 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733009 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733011 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733014 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733016 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733019 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733021 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733024 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733026 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:53.734203 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733045 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733048 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733051 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733054 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733058 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733061 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733064 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733067 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733070 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733073 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733076 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733078 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733081 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733083 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733086 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733089 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733092 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733094 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733097 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733100 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:53.734701 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733102 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733105 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733108 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733111 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733113 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733116 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733119 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733121 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733124 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733127 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733129 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733131 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733134 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733136 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733140 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733142 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733146 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733155 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733158 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733161 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:53.735207 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733163 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733166 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733168 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733171 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733173 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733177 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733180 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733183 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733186 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733188 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733191 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733194 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733196 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733198 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733201 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733203 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733206 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733209 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:53.735737 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733211 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.733217 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733321 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733326 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733329 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733332 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733335 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733337 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733341 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733343 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733346 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733349 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733353 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733356 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733358 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:53.736196 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733361 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733364 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733368 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733372 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733375 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733377 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733380 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733383 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733386 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733388 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733391 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733394 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733396 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733400 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733403 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733405 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733408 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733410 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733413 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:53.736591 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733416 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733418 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733421 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733423 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733426 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733428 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733431 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733434 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733436 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733439 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733442 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733446 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733449 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733452 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733454 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733457 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733459 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733462 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733464 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733467 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:53.737317 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733469 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733472 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733474 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733477 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733479 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733482 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733484 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733487 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733489 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733492 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733495 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733497 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733500 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733503 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733505 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733508 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733511 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733513 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733517 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733520 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:53.737980 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733522 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733525 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733528 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733530 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733533 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733535 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733538 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733541 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733543 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733546 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733548 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733551 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733553 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:53.733556 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.733561 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:29:53.738698 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.734301 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:29:53.739118 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.736454 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:29:53.739217 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.739162 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 14:29:53.739438 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.739419 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:29:53.739496 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.739467 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:29:53.765902 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.765874 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:29:53.770667 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.770644 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:29:53.788562 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.788538 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:29:53.793969 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.793948 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 14:29:53.795911 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.795893 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:29:53.797909 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.797888 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:29:53.800896 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.800875 2579 fs.go:135] Filesystem UUIDs: map[36a2182a-3095-4bcd-a07e-41c9de4b4ff6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f6a80d73-cd66-4744-93d8-7a7cef244bba:/dev/nvme0n1p3]
Apr 16 14:29:53.800896 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.800896 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:29:53.807941 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.807820 2579 manager.go:217] Machine: {Timestamp:2026-04-16 14:29:53.805686556 +0000 UTC m=+0.423476313 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100119 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25e15ef59dc8c947de7afa7183e26a SystemUUID:ec25e15e-f59d-c8c9-47de-7afa7183e26a BootID:fbe04fdd-3648-44f7-8299-6b389cf2a709 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7a:fb:34:bb:57 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7a:fb:34:bb:57 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:03:0b:f9:56:35 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:29:53.807941 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.807929 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:29:53.808108 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.808073 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:29:53.809307 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.809277 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:29:53.809483 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.809310 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:29:53.809565 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.809498 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:29:53.809565 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.809510 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:29:53.809565 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.809529 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:29:53.810592 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.810580 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:29:53.811532 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.811519 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:29:53.811678 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.811667 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 14:29:53.814582 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.814570 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 14:29:53.815094 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.815083 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 14:29:53.815146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.815112 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 14:29:53.815146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.815127 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 16 14:29:53.815146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.815140 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 14:29:53.816257 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.816243 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:29:53.816322 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.816268 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:29:53.817578 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.817558 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pg7nn"
Apr 16 14:29:53.819465 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.819442 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 14:29:53.821250 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.821237 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 14:29:53.822692 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822679 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822696 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822704 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822713 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822718 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822725 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822731 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822736 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822743 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822749 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 14:29:53.822759 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822759 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 14:29:53.823102 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.822771 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 14:29:53.823686 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.823675 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 14:29:53.823720 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.823687 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 14:29:53.824370 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.824346 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 14:29:53.824428 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.824404 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 14:29:53.824787 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.824769 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pg7nn"
Apr 16 14:29:53.827255 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.827242 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 14:29:53.827330 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.827278 2579 server.go:1295] "Started kubelet"
Apr 16 14:29:53.827412 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.827359 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 14:29:53.827515 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.827424 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 14:29:53.827515 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.827490 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 14:29:53.827965 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.827947 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 14:29:53.828698 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.828679 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 14:29:53.828688 ip-10-0-140-144 systemd[1]: Started Kubernetes Kubelet.
Apr 16 14:29:53.830390 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.830376 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 14:29:53.834948 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.834928 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 14:29:53.835413 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.835397 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 14:29:53.836109 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.836092 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 14:29:53.836109 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.836104 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 14:29:53.836222 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.836119 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 14:29:53.836325 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.836299 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 16 14:29:53.836398 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.836331 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 14:29:53.836398 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.836343 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 14:29:53.837251 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.837232 2579 factory.go:55] Registering systemd factory
Apr 16 14:29:53.837251 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.837253 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 16 14:29:53.837569 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.837547 2579 factory.go:153] Registering CRI-O factory
Apr 16 14:29:53.837569 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.837565 2579 factory.go:223] Registration of the crio container factory successfully
Apr 16 14:29:53.837676 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.837630 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 14:29:53.837676 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.837659 2579 factory.go:103] Registering Raw factory
Apr 16 14:29:53.837676 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.837675 2579 manager.go:1196] Started watching for new ooms in manager
Apr 16 14:29:53.838094 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.838083 2579 manager.go:319] Starting recovery of all containers
Apr 16 14:29:53.838481 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.838461 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:29:53.838737 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.838718 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 14:29:53.845020 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.844999 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-144.ec2.internal\" not found" node="ip-10-0-140-144.ec2.internal"
Apr 16 14:29:53.853266 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.853110 2579 manager.go:324] Recovery completed
Apr 16 14:29:53.854479 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.854454 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 14:29:53.857444 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.857432 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:29:53.859701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.859685 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:29:53.859772 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.859717 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:29:53.859772 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.859731 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:29:53.860290 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.860277 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 14:29:53.860351 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.860290 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 14:29:53.860351 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.860312 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:29:53.862807 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.862795 2579 policy_none.go:49] "None policy: Start"
Apr 16 14:29:53.862841 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.862812 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 14:29:53.862841 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.862822 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 14:29:53.902749 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.902727 2579 manager.go:341] "Starting Device Plugin manager"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.902787 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.902802 2579 server.go:85] "Starting device plugin registration server"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.903087 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.903100 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.903180 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.903260 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.903268 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.903816 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 14:29:53.915278 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.903853 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-144.ec2.internal\" not found"
Apr 16 14:29:53.972023 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.971932 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 14:29:53.973108 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.973092 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 14:29:53.973172 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.973130 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 14:29:53.973172 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.973158 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 14:29:53.973172 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.973166 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:29:53.973296 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:53.973207 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:29:53.976631 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:53.976614 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:54.003781 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.003759 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:54.004738 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.004721 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:54.004820 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.004752 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:54.004820 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.004762 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:54.004820 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.004788 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.013744 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.013729 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.013801 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.013752 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-144.ec2.internal\": node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 
14:29:54.027528 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.027509 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.073501 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.073464 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"] Apr 16 14:29:54.073585 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.073561 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:54.074535 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.074518 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:54.074611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.074550 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:54.074611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.074561 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:54.075795 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.075782 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:54.075929 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.075915 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.075965 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.075960 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:54.076479 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.076460 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:54.076551 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.076485 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:54.076551 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.076495 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:54.076551 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.076462 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:54.076638 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.076560 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:54.076638 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.076575 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:54.078346 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.078331 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.078399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.078362 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:54.079006 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.078987 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:54.079084 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.079021 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:54.079084 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.079051 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:54.102872 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.102846 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-144.ec2.internal\" not found" node="ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.107316 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.107298 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-144.ec2.internal\" not found" node="ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.128170 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.128148 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.137828 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.137806 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e28845082dd3a225de8448aa7a81a8c9-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"e28845082dd3a225de8448aa7a81a8c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.137890 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.137835 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e28845082dd3a225de8448aa7a81a8c9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"e28845082dd3a225de8448aa7a81a8c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.137890 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.137853 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab72f00075c1175e59c7e696357a6702-config\") pod \"kube-apiserver-proxy-ip-10-0-140-144.ec2.internal\" (UID: \"ab72f00075c1175e59c7e696357a6702\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.229126 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.229048 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.238598 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.238574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e28845082dd3a225de8448aa7a81a8c9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"e28845082dd3a225de8448aa7a81a8c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.238657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.238604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e28845082dd3a225de8448aa7a81a8c9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"e28845082dd3a225de8448aa7a81a8c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.238657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.238624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab72f00075c1175e59c7e696357a6702-config\") pod \"kube-apiserver-proxy-ip-10-0-140-144.ec2.internal\" (UID: \"ab72f00075c1175e59c7e696357a6702\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.238723 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.238666 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e28845082dd3a225de8448aa7a81a8c9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"e28845082dd3a225de8448aa7a81a8c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.238723 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.238668 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e28845082dd3a225de8448aa7a81a8c9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal\" (UID: \"e28845082dd3a225de8448aa7a81a8c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.238723 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.238713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab72f00075c1175e59c7e696357a6702-config\") pod \"kube-apiserver-proxy-ip-10-0-140-144.ec2.internal\" (UID: \"ab72f00075c1175e59c7e696357a6702\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.329574 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.329540 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.407049 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.407001 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.410689 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.410672 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.429777 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.429736 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.530380 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.530275 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.630779 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.630748 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.731394 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.731362 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-144.ec2.internal\" not found" Apr 16 14:29:54.738746 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.738724 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:29:54.738869 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.738859 2579 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:29:54.738931 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.738906 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:29:54.807894 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.807823 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:29:54.815797 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.815766 2579 apiserver.go:52] "Watching apiserver"
Apr 16 14:29:54.825658 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.825142 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 14:29:54.826380 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.826330 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:24:53 +0000 UTC" deadline="2028-01-12 00:25:01.45728909 +0000 UTC"
Apr 16 14:29:54.826380 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.826376 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15249h55m6.630915543s"
Apr 16 14:29:54.827185 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.827158 2579 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["kube-system/konnectivity-agent-t5h5z","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8","openshift-cluster-node-tuning-operator/tuned-7jmkf","openshift-dns/node-resolver-vjfvw","openshift-multus/multus-v76f6","openshift-multus/network-metrics-daemon-m57qr","openshift-image-registry/node-ca-4648f","openshift-multus/multus-additional-cni-plugins-7dhhl","openshift-network-diagnostics/network-check-target-l6qjc","openshift-network-operator/iptables-alerter-k7pvw","openshift-ovn-kubernetes/ovnkube-node-vclsl"]
Apr 16 14:29:54.829150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.829134 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t5h5z"
Apr 16 14:29:54.830377 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.830361 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.831518 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.831497 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wfqmd\""
Apr 16 14:29:54.831628 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.831497 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 14:29:54.831628 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.831497 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 14:29:54.832545 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.832528 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 14:29:54.832640 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.832550 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 14:29:54.832640 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.832611 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 14:29:54.832736 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.832703 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vd8z2\""
Apr 16 14:29:54.832794 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.832778 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.832880 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.832865 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.834337 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.834316 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.834656 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.834637 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x9kfh\""
Apr 16 14:29:54.834970 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.834930 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:29:54.835095 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.834993 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 14:29:54.835095 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.835007 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d74dj\""
Apr 16 14:29:54.835095 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.835014 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:29:54.835095 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.835066 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 14:29:54.835095 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.835089 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:29:54.835563 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.835551 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"
Apr 16 14:29:54.836359 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.836197 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 14:29:54.836359 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.836256 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 14:29:54.836359 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.836267 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 14:29:54.836646 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.836394 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 14:29:54.836646 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.836642 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:54.836752 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.836685 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-84lv2\""
Apr 16 14:29:54.836752 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.836709 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:29:54.836844 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.836759 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.838085 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.838070 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:54.838898 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.838880 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 14:29:54.839399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.839384 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:29:54.839483 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.839454 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:29:54.840131 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.839935 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 14:29:54.840131 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.839942 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 14:29:54.840131 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840025 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 14:29:54.840318 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840278 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hs6n7\""
Apr 16 14:29:54.840371 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840337 2579
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ddxcq\""
Apr 16 14:29:54.840371 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840364 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 14:29:54.840929 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840914 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k7pvw"
Apr 16 14:29:54.840995 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840928 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-cni-multus\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.840995 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76137a5e-4deb-4e87-b7e5-d17bde0111e4-host\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.840995 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.840982 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76137a5e-4deb-4e87-b7e5-d17bde0111e4-serviceca\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.841159 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5zt\" (UniqueName:
\"kubernetes.io/projected/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-kube-api-access-fl5zt\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.841159 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:54.841261 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841167 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-modprobe-d\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841261 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841194 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-var-lib-kubelet\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841261 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841216 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-system-cni-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841261 ip-10-0-140-144 kubenswrapper[2579]: I0416
14:29:54.841247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-cni-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841269 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-k8s-cni-cncf-io\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrnm\" (UniqueName: \"kubernetes.io/projected/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-kube-api-access-rvrnm\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.841399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysctl-conf\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45lhk\" (UniqueName: \"kubernetes.io/projected/79ff87e3-bc60-4320-a398-0c605679612c-kube-api-access-45lhk\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") "
pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-run\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841403 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqdb\" (UniqueName: \"kubernetes.io/projected/545d3883-e3bc-4b57-b29f-358dd3038d53-kube-api-access-4qqdb\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841429 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-netns\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42343077-d6d1-4ad1-ba48-405f8545fbef-konnectivity-ca\") pod \"konnectivity-agent-t5h5z\" (UID: \"42343077-d6d1-4ad1-ba48-405f8545fbef\") " pod="kube-system/konnectivity-agent-t5h5z"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-registration-dir\") pod
\"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841477 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-sys-fs\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841495 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-tmp-dir\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841541 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-tuned\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/545d3883-e3bc-4b57-b29f-358dd3038d53-tmp\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName:
\"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-cnibin\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/79ff87e3-bc60-4320-a398-0c605679612c-cni-binary-copy\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841607 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-cni-bin\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-device-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841638 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-etc-selinux\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841657 2579 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysconfig\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-kubelet\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841723 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-hostroot\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841762 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-conf-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/79ff87e3-bc60-4320-a398-0c605679612c-multus-daemon-config\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416
14:29:54.841809 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-socket-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841844 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwrp\" (UniqueName: \"kubernetes.io/projected/20614211-2bef-41dc-aad8-94242eb8364c-kube-api-access-dlwrp\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysctl-d\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.841905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841904 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-lib-modules\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841919 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-socket-dir-parent\") pod \"multus-v76f6\" (UID:
\"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-multus-certs\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841953 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-etc-kubernetes\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.841994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-kubernetes\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842008 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8krdt\" (UniqueName:
\"kubernetes.io/projected/76137a5e-4deb-4e87-b7e5-d17bde0111e4-kube-api-access-8krdt\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42343077-d6d1-4ad1-ba48-405f8545fbef-agent-certs\") pod \"konnectivity-agent-t5h5z\" (UID: \"42343077-d6d1-4ad1-ba48-405f8545fbef\") " pod="kube-system/konnectivity-agent-t5h5z"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842053 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-hosts-file\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-systemd\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-sys\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842111 2579 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-host\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842139 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-os-release\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.842335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842264 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.842774 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842730 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ktzjx\""
Apr 16 14:29:54.842814 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842784 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:29:54.842814 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.842804 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 14:29:54.843021 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.843006 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 14:29:54.844606 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.844432 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 14:29:54.844606 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.844483 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 14:29:54.844606 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.844484 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 14:29:54.844606 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.844506 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c7xln\""
Apr 16 14:29:54.844892 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.844661 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 14:29:54.844892 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.844699 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 14:29:54.844892 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.844774 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 14:29:54.847572 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.847540 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal"]
Apr 16 14:29:54.848949 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.848926 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:29:54.849094 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.849073 2579 kubelet.go:3340] "Creating a mirror pod for static pod"
pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" Apr 16 14:29:54.852156 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.852137 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:29:54.855991 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.855975 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal"] Apr 16 14:29:54.856079 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.856044 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:29:54.867835 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.867816 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w7gpj" Apr 16 14:29:54.875086 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:54.875049 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab72f00075c1175e59c7e696357a6702.slice/crio-f16a25e83fa398d669ac3a5807514b60a750304b102e82b2a80be816832c3061 WatchSource:0}: Error finding container f16a25e83fa398d669ac3a5807514b60a750304b102e82b2a80be816832c3061: Status 404 returned error can't find the container with id f16a25e83fa398d669ac3a5807514b60a750304b102e82b2a80be816832c3061 Apr 16 14:29:54.875361 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:54.875339 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode28845082dd3a225de8448aa7a81a8c9.slice/crio-fb81689a0419b7c548b5eac932d16f8b79fb6a869a1ba916890a88e702b29eb3 WatchSource:0}: Error finding container fb81689a0419b7c548b5eac932d16f8b79fb6a869a1ba916890a88e702b29eb3: 
Status 404 returned error can't find the container with id fb81689a0419b7c548b5eac932d16f8b79fb6a869a1ba916890a88e702b29eb3 Apr 16 14:29:54.876570 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.876554 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w7gpj" Apr 16 14:29:54.881119 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.881103 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:29:54.937555 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.937531 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:29:54.942570 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/79ff87e3-bc60-4320-a398-0c605679612c-cni-binary-copy\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.942669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942582 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-kubelet\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.942669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-hostroot\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.942669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942635 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:54.942669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-kubelet\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.942811 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-hostroot\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.942811 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942679 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv7f7\" (UniqueName: \"kubernetes.io/projected/88fbb2da-2de8-4d14-aa18-1817fb16e61c-kube-api-access-nv7f7\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:54.942811 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-etc-selinux\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.942811 ip-10-0-140-144 
kubenswrapper[2579]: I0416 14:29:54.942748 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-etc-kubernetes\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.942811 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:29:54.942811 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942789 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-kubelet\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.942811 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942807 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-slash\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942830 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-etc-selinux\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovnkube-config\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-etc-kubernetes\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwrp\" (UniqueName: \"kubernetes.io/projected/20614211-2bef-41dc-aad8-94242eb8364c-kube-api-access-dlwrp\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysctl-d\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-socket-dir-parent\") pod 
\"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.942997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cni-binary-copy\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943059 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-socket-dir-parent\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f2a6e36-a780-456c-a172-4207fbfa5df6-iptables-alerter-script\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943092 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-run-netns\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-systemd\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.943121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysctl-d\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/79ff87e3-bc60-4320-a398-0c605679612c-cni-binary-copy\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943184 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-node-log\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-systemd\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943228 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-sys\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-os-release\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943260 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9mxr\" (UniqueName: \"kubernetes.io/projected/dc8ac89f-dee2-4e7c-b409-39b0900c673e-kube-api-access-h9mxr\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5zt\" (UniqueName: \"kubernetes.io/projected/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-kube-api-access-fl5zt\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943288 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-sys\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-modprobe-d\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-os-release\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943332 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-var-lib-kubelet\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943369 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-system-cni-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943376 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-modprobe-d\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-systemd\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-k8s-cni-cncf-io\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.943612 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-system-cni-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-etc-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943446 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-k8s-cni-cncf-io\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943450 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-cni-netd\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysctl-conf\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45lhk\" (UniqueName: \"kubernetes.io/projected/79ff87e3-bc60-4320-a398-0c605679612c-kube-api-access-45lhk\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943586 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-var-lib-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943605 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysctl-conf\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943612 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-env-overrides\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943653 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-system-cni-dir\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943678 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-systemd-units\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943718 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-log-socket\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-registration-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943772 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-sys-fs\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943371 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-var-lib-kubelet\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-tuned\") pod \"tuned-7jmkf\" 
(UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-registration-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.944422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-cni-bin\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-sys-fs\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-conf-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6" Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943885 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42343077-d6d1-4ad1-ba48-405f8545fbef-agent-certs\") pod 
\"konnectivity-agent-t5h5z\" (UID: \"42343077-d6d1-4ad1-ba48-405f8545fbef\") " pod="kube-system/konnectivity-agent-t5h5z"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943904 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-conf-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943884 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-cni-bin\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76137a5e-4deb-4e87-b7e5-d17bde0111e4-serviceca\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42343077-d6d1-4ad1-ba48-405f8545fbef-konnectivity-ca\") pod \"konnectivity-agent-t5h5z\" (UID: \"42343077-d6d1-4ad1-ba48-405f8545fbef\") " pod="kube-system/konnectivity-agent-t5h5z"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.943990 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-device-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysconfig\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944021 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/79ff87e3-bc60-4320-a398-0c605679612c-multus-daemon-config\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-device-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944071 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-multus-certs\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944089 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-sysconfig\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-multus-certs\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944102 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944136 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-socket-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.945081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944121 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944207 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-lib-modules\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-kubernetes\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944258 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-socket-dir\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8krdt\" (UniqueName: \"kubernetes.io/projected/76137a5e-4deb-4e87-b7e5-d17bde0111e4-kube-api-access-8krdt\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944355 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-kubernetes\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944362 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-lib-modules\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-hosts-file\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-host\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-cni-multus\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76137a5e-4deb-4e87-b7e5-d17bde0111e4-host\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-hosts-file\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944531 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42343077-d6d1-4ad1-ba48-405f8545fbef-konnectivity-ca\") pod \"konnectivity-agent-t5h5z\" (UID: \"42343077-d6d1-4ad1-ba48-405f8545fbef\") " pod="kube-system/konnectivity-agent-t5h5z"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944549 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxtbv\" (UniqueName: \"kubernetes.io/projected/2f2a6e36-a780-456c-a172-4207fbfa5df6-kube-api-access-kxtbv\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944554 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76137a5e-4deb-4e87-b7e5-d17bde0111e4-serviceca\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.945867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovn-node-metrics-cert\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944596 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-host\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944614 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/79ff87e3-bc60-4320-a398-0c605679612c-multus-daemon-config\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-var-lib-cni-multus\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944639 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76137a5e-4deb-4e87-b7e5-d17bde0111e4-host\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-cni-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cnibin\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.944684 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-multus-cni-dir\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrnm\" (UniqueName: \"kubernetes.io/projected/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-kube-api-access-rvrnm\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-ovn\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:54.944766 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:29:55.44473577 +0000 UTC m=+2.062525547 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-cni-bin\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944841 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-run\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944867 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqdb\" (UniqueName: \"kubernetes.io/projected/545d3883-e3bc-4b57-b29f-358dd3038d53-kube-api-access-4qqdb\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944892 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-netns\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.946657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944918 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-os-release\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/545d3883-e3bc-4b57-b29f-358dd3038d53-run\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944944 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f2a6e36-a780-456c-a172-4207fbfa5df6-host-slash\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944949 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-host-run-netns\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.944966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.945012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovnkube-script-lib\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.945084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-tmp-dir\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.945104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/545d3883-e3bc-4b57-b29f-358dd3038d53-tmp\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.945118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-cnibin\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.945193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/79ff87e3-bc60-4320-a398-0c605679612c-cnibin\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.947321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.945312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-tmp-dir\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.947598 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.947333 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/545d3883-e3bc-4b57-b29f-358dd3038d53-tmp\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.947598 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.947360 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/545d3883-e3bc-4b57-b29f-358dd3038d53-etc-tuned\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.947653 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.947617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42343077-d6d1-4ad1-ba48-405f8545fbef-agent-certs\") pod \"konnectivity-agent-t5h5z\" (UID: \"42343077-d6d1-4ad1-ba48-405f8545fbef\") " pod="kube-system/konnectivity-agent-t5h5z"
Apr 16 14:29:54.952271 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.952245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwrp\" (UniqueName: \"kubernetes.io/projected/20614211-2bef-41dc-aad8-94242eb8364c-kube-api-access-dlwrp\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:54.952532 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.952509 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5zt\" (UniqueName: \"kubernetes.io/projected/83ceccd4-351d-4f7b-a366-eff6c1e34ba1-kube-api-access-fl5zt\") pod \"aws-ebs-csi-driver-node-7kdm8\" (UID: \"83ceccd4-351d-4f7b-a366-eff6c1e34ba1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8"
Apr 16 14:29:54.952570 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.952543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45lhk\" (UniqueName: \"kubernetes.io/projected/79ff87e3-bc60-4320-a398-0c605679612c-kube-api-access-45lhk\") pod \"multus-v76f6\" (UID: \"79ff87e3-bc60-4320-a398-0c605679612c\") " pod="openshift-multus/multus-v76f6"
Apr 16 14:29:54.953279 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.953259 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8krdt\" (UniqueName: \"kubernetes.io/projected/76137a5e-4deb-4e87-b7e5-d17bde0111e4-kube-api-access-8krdt\") pod \"node-ca-4648f\" (UID: \"76137a5e-4deb-4e87-b7e5-d17bde0111e4\") " pod="openshift-image-registry/node-ca-4648f"
Apr 16 14:29:54.953350 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.953327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrnm\" (UniqueName: \"kubernetes.io/projected/1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa-kube-api-access-rvrnm\") pod \"node-resolver-vjfvw\" (UID: \"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa\") " pod="openshift-dns/node-resolver-vjfvw"
Apr 16 14:29:54.953631 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.953613 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqdb\" (UniqueName: \"kubernetes.io/projected/545d3883-e3bc-4b57-b29f-358dd3038d53-kube-api-access-4qqdb\") pod \"tuned-7jmkf\" (UID: \"545d3883-e3bc-4b57-b29f-358dd3038d53\") " pod="openshift-cluster-node-tuning-operator/tuned-7jmkf"
Apr 16 14:29:54.976248 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.976208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" event={"ID":"ab72f00075c1175e59c7e696357a6702","Type":"ContainerStarted","Data":"f16a25e83fa398d669ac3a5807514b60a750304b102e82b2a80be816832c3061"}
Apr 16 14:29:54.977199 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:54.977180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" event={"ID":"e28845082dd3a225de8448aa7a81a8c9","Type":"ContainerStarted","Data":"fb81689a0419b7c548b5eac932d16f8b79fb6a869a1ba916890a88e702b29eb3"}
Apr 16 14:29:55.006853 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.006829 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:29:55.045912 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.045889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:29:55.046000 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.045917 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-kubelet\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046000 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.045940 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-slash\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046000 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.045977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-slash\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046107 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-kubelet\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046151 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046136 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovnkube-config\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046184 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cni-binary-copy\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:55.046212 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f2a6e36-a780-456c-a172-4207fbfa5df6-iptables-alerter-script\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw"
Apr 16 14:29:55.046243 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-run-netns\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046277 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-systemd\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046277 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-node-log\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046350 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-run-netns\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046350 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-node-log\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046350 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046319 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-systemd\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046350 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9mxr\" (UniqueName: \"kubernetes.io/projected/dc8ac89f-dee2-4e7c-b409-39b0900c673e-kube-api-access-h9mxr\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-etc-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-cni-netd\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-var-lib-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-env-overrides\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-system-cni-dir\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:55.046504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046490 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-cni-netd\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046731 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046494 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-etc-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046731 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-systemd-units\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046731 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-var-lib-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046731 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-systemd-units\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046731 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046704 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-system-cni-dir\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:55.046731 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046722 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cni-binary-copy\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-log-socket\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046773 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-log-socket\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl"
Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046809 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046828 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl"
Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]:
I0416 14:29:55.046846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovnkube-config\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046870 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxtbv\" (UniqueName: \"kubernetes.io/projected/2f2a6e36-a780-456c-a172-4207fbfa5df6-kube-api-access-kxtbv\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw" Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f2a6e36-a780-456c-a172-4207fbfa5df6-iptables-alerter-script\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw" Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046887 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-openvswitch\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046895 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovn-node-metrics-cert\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046907 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-env-overrides\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.046953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.046929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-run-ovn-kubernetes\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047014 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cnibin\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-ovn\") pod \"ovnkube-node-vclsl\" (UID: 
\"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-cni-bin\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-os-release\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f2a6e36-a780-456c-a172-4207fbfa5df6-host-slash\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cnibin\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047132 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-run-ovn\") pod 
\"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f2a6e36-a780-456c-a172-4207fbfa5df6-host-slash\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-cni-bin\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047173 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc8ac89f-dee2-4e7c-b409-39b0900c673e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047191 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovnkube-script-lib\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-os-release\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047217 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047226 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.047506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv7f7\" (UniqueName: \"kubernetes.io/projected/88fbb2da-2de8-4d14-aa18-1817fb16e61c-kube-api-access-nv7f7\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.047992 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovnkube-script-lib\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.047992 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.047739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/88fbb2da-2de8-4d14-aa18-1817fb16e61c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.048121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.048101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/88fbb2da-2de8-4d14-aa18-1817fb16e61c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.049105 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.049090 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc8ac89f-dee2-4e7c-b409-39b0900c673e-ovn-node-metrics-cert\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.056661 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.056641 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:55.056661 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.056660 2579 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:55.056804 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.056670 2579 projected.go:194] Error preparing data for projected volume kube-api-access-nk9qc for pod openshift-network-diagnostics/network-check-target-l6qjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:55.056804 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.056735 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc podName:38b35ba8-4e6c-4198-bb10-ea4df7f8816a nodeName:}" failed. No retries permitted until 2026-04-16 14:29:55.556719711 +0000 UTC m=+2.174509453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nk9qc" (UniqueName: "kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc") pod "network-check-target-l6qjc" (UID: "38b35ba8-4e6c-4198-bb10-ea4df7f8816a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:55.058714 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.058662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxtbv\" (UniqueName: \"kubernetes.io/projected/2f2a6e36-a780-456c-a172-4207fbfa5df6-kube-api-access-kxtbv\") pod \"iptables-alerter-k7pvw\" (UID: \"2f2a6e36-a780-456c-a172-4207fbfa5df6\") " pod="openshift-network-operator/iptables-alerter-k7pvw" Apr 16 14:29:55.058899 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.058884 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9mxr\" (UniqueName: 
\"kubernetes.io/projected/dc8ac89f-dee2-4e7c-b409-39b0900c673e-kube-api-access-h9mxr\") pod \"ovnkube-node-vclsl\" (UID: \"dc8ac89f-dee2-4e7c-b409-39b0900c673e\") " pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.059584 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.059562 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv7f7\" (UniqueName: \"kubernetes.io/projected/88fbb2da-2de8-4d14-aa18-1817fb16e61c-kube-api-access-nv7f7\") pod \"multus-additional-cni-plugins-7dhhl\" (UID: \"88fbb2da-2de8-4d14-aa18-1817fb16e61c\") " pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.151917 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.151886 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t5h5z" Apr 16 14:29:55.158940 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.158911 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42343077_d6d1_4ad1_ba48_405f8545fbef.slice/crio-d53a50e839b09ce3b816a12a71e5748e46907f4fba71b54f5124f54573186243 WatchSource:0}: Error finding container d53a50e839b09ce3b816a12a71e5748e46907f4fba71b54f5124f54573186243: Status 404 returned error can't find the container with id d53a50e839b09ce3b816a12a71e5748e46907f4fba71b54f5124f54573186243 Apr 16 14:29:55.181864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.181837 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" Apr 16 14:29:55.187560 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.187537 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ceccd4_351d_4f7b_a366_eff6c1e34ba1.slice/crio-e0d63d8e35f023802dc991f8f4372c84a9dc6851a267fa137567e95e68465ce9 WatchSource:0}: Error finding container e0d63d8e35f023802dc991f8f4372c84a9dc6851a267fa137567e95e68465ce9: Status 404 returned error can't find the container with id e0d63d8e35f023802dc991f8f4372c84a9dc6851a267fa137567e95e68465ce9 Apr 16 14:29:55.188331 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.188316 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" Apr 16 14:29:55.193456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.193432 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vjfvw" Apr 16 14:29:55.194103 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.194083 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545d3883_e3bc_4b57_b29f_358dd3038d53.slice/crio-c99cc7de9e2351dddb30c03673e8370313cd4090aaa4c20a132d17487acfada4 WatchSource:0}: Error finding container c99cc7de9e2351dddb30c03673e8370313cd4090aaa4c20a132d17487acfada4: Status 404 returned error can't find the container with id c99cc7de9e2351dddb30c03673e8370313cd4090aaa4c20a132d17487acfada4 Apr 16 14:29:55.199976 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.199955 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-v76f6" Apr 16 14:29:55.200655 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.200635 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec5f4c5_a526_4cf8_82d9_0b554ed2fbaa.slice/crio-89ab58d581b9081b458a27354d38f6db34b1ff631257887967eef966071ccba8 WatchSource:0}: Error finding container 89ab58d581b9081b458a27354d38f6db34b1ff631257887967eef966071ccba8: Status 404 returned error can't find the container with id 89ab58d581b9081b458a27354d38f6db34b1ff631257887967eef966071ccba8 Apr 16 14:29:55.204746 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.204691 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4648f" Apr 16 14:29:55.206671 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.206647 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ff87e3_bc60_4320_a398_0c605679612c.slice/crio-fb30f814956a9830258afaec48a4508ae84f3a11388202e79ebc4942a166b6da WatchSource:0}: Error finding container fb30f814956a9830258afaec48a4508ae84f3a11388202e79ebc4942a166b6da: Status 404 returned error can't find the container with id fb30f814956a9830258afaec48a4508ae84f3a11388202e79ebc4942a166b6da Apr 16 14:29:55.209518 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.209498 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" Apr 16 14:29:55.212348 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.212322 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76137a5e_4deb_4e87_b7e5_d17bde0111e4.slice/crio-00ffd9056ccfb61f5941253289abf2d572ddba72d3694e5ca308a8a212d44261 WatchSource:0}: Error finding container 00ffd9056ccfb61f5941253289abf2d572ddba72d3694e5ca308a8a212d44261: Status 404 returned error can't find the container with id 00ffd9056ccfb61f5941253289abf2d572ddba72d3694e5ca308a8a212d44261 Apr 16 14:29:55.215178 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.215156 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k7pvw" Apr 16 14:29:55.217335 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.217299 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88fbb2da_2de8_4d14_aa18_1817fb16e61c.slice/crio-94c567e32bdca336f5f627214e4809a9795ea6904ae21d503b4dafc9c4a8d8b5 WatchSource:0}: Error finding container 94c567e32bdca336f5f627214e4809a9795ea6904ae21d503b4dafc9c4a8d8b5: Status 404 returned error can't find the container with id 94c567e32bdca336f5f627214e4809a9795ea6904ae21d503b4dafc9c4a8d8b5 Apr 16 14:29:55.220904 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.220735 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:29:55.223300 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.223224 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2a6e36_a780_456c_a172_4207fbfa5df6.slice/crio-7ba36364dbc98159700bb81d59e32da9eb792e4276965f9da8861ec2327f5a60 WatchSource:0}: Error finding container 7ba36364dbc98159700bb81d59e32da9eb792e4276965f9da8861ec2327f5a60: Status 404 returned error can't find the container with id 7ba36364dbc98159700bb81d59e32da9eb792e4276965f9da8861ec2327f5a60 Apr 16 14:29:55.229612 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:29:55.229587 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc8ac89f_dee2_4e7c_b409_39b0900c673e.slice/crio-abea9571cb942419754eff9fbc63e2c78dc5a09c22d331917b86f6e1335f7fc6 WatchSource:0}: Error finding container abea9571cb942419754eff9fbc63e2c78dc5a09c22d331917b86f6e1335f7fc6: Status 404 returned error can't find the container with id abea9571cb942419754eff9fbc63e2c78dc5a09c22d331917b86f6e1335f7fc6 Apr 16 14:29:55.450350 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.450092 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:29:55.450350 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.450288 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:55.450350 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.450347 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:29:56.450329135 +0000 UTC m=+3.068118879 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:55.651857 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.651763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:29:55.652054 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.651929 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:55.652054 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.651949 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:55.652054 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.651961 2579 projected.go:194] Error preparing data for projected volume kube-api-access-nk9qc for pod openshift-network-diagnostics/network-check-target-l6qjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:55.652054 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.652014 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc podName:38b35ba8-4e6c-4198-bb10-ea4df7f8816a nodeName:}" failed. No retries permitted until 2026-04-16 14:29:56.651996739 +0000 UTC m=+3.269786495 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk9qc" (UniqueName: "kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc") pod "network-check-target-l6qjc" (UID: "38b35ba8-4e6c-4198-bb10-ea4df7f8816a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:55.827379 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.827072 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:55.877833 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.877688 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:24:54 +0000 UTC" deadline="2027-11-26 06:47:31.42582517 +0000 UTC" Apr 16 14:29:55.877833 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.877725 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14128h17m35.548104968s" Apr 16 14:29:55.974491 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:55.973676 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:29:55.974491 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:55.973803 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:29:56.001290 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.001237 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" event={"ID":"545d3883-e3bc-4b57-b29f-358dd3038d53","Type":"ContainerStarted","Data":"c99cc7de9e2351dddb30c03673e8370313cd4090aaa4c20a132d17487acfada4"}
Apr 16 14:29:56.025103 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.023267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" event={"ID":"83ceccd4-351d-4f7b-a366-eff6c1e34ba1","Type":"ContainerStarted","Data":"e0d63d8e35f023802dc991f8f4372c84a9dc6851a267fa137567e95e68465ce9"}
Apr 16 14:29:56.037078 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.037046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerStarted","Data":"94c567e32bdca336f5f627214e4809a9795ea6904ae21d503b4dafc9c4a8d8b5"}
Apr 16 14:29:56.038785 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.038760 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:29:56.053748 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.053718 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4648f" event={"ID":"76137a5e-4deb-4e87-b7e5-d17bde0111e4","Type":"ContainerStarted","Data":"00ffd9056ccfb61f5941253289abf2d572ddba72d3694e5ca308a8a212d44261"}
Apr 16 14:29:56.084390 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.084295 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t5h5z" event={"ID":"42343077-d6d1-4ad1-ba48-405f8545fbef","Type":"ContainerStarted","Data":"d53a50e839b09ce3b816a12a71e5748e46907f4fba71b54f5124f54573186243"}
Apr 16 14:29:56.110784 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.110718 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"abea9571cb942419754eff9fbc63e2c78dc5a09c22d331917b86f6e1335f7fc6"}
Apr 16 14:29:56.119630 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.119543 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k7pvw" event={"ID":"2f2a6e36-a780-456c-a172-4207fbfa5df6","Type":"ContainerStarted","Data":"7ba36364dbc98159700bb81d59e32da9eb792e4276965f9da8861ec2327f5a60"}
Apr 16 14:29:56.128184 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.128140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v76f6" event={"ID":"79ff87e3-bc60-4320-a398-0c605679612c","Type":"ContainerStarted","Data":"fb30f814956a9830258afaec48a4508ae84f3a11388202e79ebc4942a166b6da"}
Apr 16 14:29:56.151353 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.151315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vjfvw" event={"ID":"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa","Type":"ContainerStarted","Data":"89ab58d581b9081b458a27354d38f6db34b1ff631257887967eef966071ccba8"}
Apr 16 14:29:56.461224 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.460579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:56.461224 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:56.460719 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:56.461224 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:56.460782 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:29:58.460762542 +0000 UTC m=+5.078552298 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:56.663801 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.663761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:29:56.663993 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:56.663966 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:29:56.663993 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:56.663985 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:29:56.664143 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:56.664000 2579 projected.go:194] Error preparing data for projected volume kube-api-access-nk9qc for pod openshift-network-diagnostics/network-check-target-l6qjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:56.664143 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:56.664075 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc podName:38b35ba8-4e6c-4198-bb10-ea4df7f8816a nodeName:}" failed. No retries permitted until 2026-04-16 14:29:58.664055723 +0000 UTC m=+5.281845471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk9qc" (UniqueName: "kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc") pod "network-check-target-l6qjc" (UID: "38b35ba8-4e6c-4198-bb10-ea4df7f8816a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:56.878183 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.878047 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:24:54 +0000 UTC" deadline="2027-12-17 03:08:11.918408438 +0000 UTC"
Apr 16 14:29:56.878183 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.878090 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14628h38m15.040324111s"
Apr 16 14:29:56.974519 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:56.973978 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:56.974519 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:56.974134 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:29:57.973693 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:57.973657 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:29:57.974206 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:57.973790 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:29:58.479853 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:58.479811 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:58.480067 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:58.480011 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:58.480131 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:58.480091 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:30:02.480071961 +0000 UTC m=+9.097861721 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:58.681865 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:58.681203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:29:58.681865 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:58.681420 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:29:58.681865 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:58.681441 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:29:58.681865 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:58.681454 2579 projected.go:194] Error preparing data for projected volume kube-api-access-nk9qc for pod openshift-network-diagnostics/network-check-target-l6qjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:58.681865 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:58.681512 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc podName:38b35ba8-4e6c-4198-bb10-ea4df7f8816a nodeName:}" failed. No retries permitted until 2026-04-16 14:30:02.681493727 +0000 UTC m=+9.299283474 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk9qc" (UniqueName: "kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc") pod "network-check-target-l6qjc" (UID: "38b35ba8-4e6c-4198-bb10-ea4df7f8816a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:58.974084 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:58.973852 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:29:58.974084 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:58.974013 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:29:59.973456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:29:59.973420 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:29:59.973710 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:29:59.973547 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:00.974761 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:00.974218 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:00.974761 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:00.974378 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:01.973920 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:01.973887 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:01.974112 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:01.974007 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:02.522517 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:02.522478 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:02.523008 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:02.522656 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:30:02.523008 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:02.522724 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:30:10.52270282 +0000 UTC m=+17.140492574 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:30:02.723855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:02.723812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:02.724054 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:02.723999 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:30:02.724054 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:02.724024 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:30:02.724054 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:02.724052 2579 projected.go:194] Error preparing data for projected volume kube-api-access-nk9qc for pod openshift-network-diagnostics/network-check-target-l6qjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:30:02.724248 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:02.724112 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc podName:38b35ba8-4e6c-4198-bb10-ea4df7f8816a nodeName:}" failed. No retries permitted until 2026-04-16 14:30:10.724093338 +0000 UTC m=+17.341883093 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk9qc" (UniqueName: "kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc") pod "network-check-target-l6qjc" (UID: "38b35ba8-4e6c-4198-bb10-ea4df7f8816a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:30:02.974439 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:02.974241 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:02.974439 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:02.974391 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:03.975047 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:03.974986 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:03.975583 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:03.975134 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:04.973790 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:04.973762 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:04.973986 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:04.973902 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:05.973587 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:05.973557 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:05.973965 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:05.973672 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:06.974320 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:06.974284 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:06.974740 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:06.974422 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:07.973889 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:07.973856 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:07.974080 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:07.973985 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:08.973951 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:08.973864 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:08.974384 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:08.973994 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:09.973917 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:09.973879 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:09.974140 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:09.974020 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:10.586327 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:10.586284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:10.586527 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:10.586451 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:30:10.586586 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:10.586541 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:30:26.586514348 +0000 UTC m=+33.204304108 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:30:10.788013 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:10.787974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:10.788213 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:10.788175 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:30:10.788213 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:10.788205 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:30:10.788320 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:10.788226 2579 projected.go:194] Error preparing data for projected volume kube-api-access-nk9qc for pod openshift-network-diagnostics/network-check-target-l6qjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:30:10.788320 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:10.788289 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc podName:38b35ba8-4e6c-4198-bb10-ea4df7f8816a nodeName:}" failed. No retries permitted until 2026-04-16 14:30:26.788273944 +0000 UTC m=+33.406063705 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk9qc" (UniqueName: "kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc") pod "network-check-target-l6qjc" (UID: "38b35ba8-4e6c-4198-bb10-ea4df7f8816a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:30:10.973775 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:10.973696 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:10.973920 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:10.973821 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:11.976294 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:11.976263 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:11.976755 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:11.976360 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:12.974166 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:12.974138 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:12.974324 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:12.974241 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:13.976541 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:13.976504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc"
Apr 16 14:30:13.977150 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:13.976626 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a"
Apr 16 14:30:14.194607 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.194575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" event={"ID":"ab72f00075c1175e59c7e696357a6702","Type":"ContainerStarted","Data":"8b3c2a07cecd780b28e4dc7e13208fc60e4999e7b760483c1b08e8e6f53baf38"}
Apr 16 14:30:14.197075 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197057 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log"
Apr 16 14:30:14.197365 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197345 2579 generic.go:358] "Generic (PLEG): container finished" podID="dc8ac89f-dee2-4e7c-b409-39b0900c673e" containerID="33d5abd93b72e53dd4daa0abb721d44e462b7cbfd793e564ddbe07fb172de5e0" exitCode=1
Apr 16 14:30:14.197415 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"a0063ca4fd453355c4dd5c7bed570d2c13464d582237a3c38677ab25c13745f7"}
Apr 16 14:30:14.197457 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"dae34c517fd145e2f7bf2b7501612f5f6f18d9dc7233b8a250e00407558a7c22"}
Apr 16 14:30:14.197457 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"314b7bfc74c9b6187479c7e0cc438c236d350ee04d1062f50d87ed6ccca9aefe"}
Apr 16 14:30:14.197457 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197448 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"bf0a99243d2083192567f841d3000e9a0dcd15e4efa7a6c51dab3f751879e6e6"}
Apr 16 14:30:14.197545 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197461 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerDied","Data":"33d5abd93b72e53dd4daa0abb721d44e462b7cbfd793e564ddbe07fb172de5e0"}
Apr 16 14:30:14.197545 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.197476 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"472cb0886d8715864a2d3d65dbd270a86a9faa3100b8a8685d4f7985604718de"}
Apr 16 14:30:14.198807 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.198785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v76f6" event={"ID":"79ff87e3-bc60-4320-a398-0c605679612c","Type":"ContainerStarted","Data":"eb9f6439bdad6ca270ac9ea1b8450fe14502d8eb385b9c17eb13828bb74f64b3"}
Apr 16 14:30:14.200099 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.200077 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" event={"ID":"545d3883-e3bc-4b57-b29f-358dd3038d53","Type":"ContainerStarted","Data":"c122b48dd30561fd181b0a25a01dfdecfabcd062c7e359afa476fc91dc01e565"}
Apr 16 14:30:14.240888 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.240822 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-144.ec2.internal" podStartSLOduration=20.240799901 podStartE2EDuration="20.240799901s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:14.215606669 +0000 UTC m=+20.833396432" watchObservedRunningTime="2026-04-16 14:30:14.240799901 +0000 UTC m=+20.858589667"
Apr 16 14:30:14.241182 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.241141 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7jmkf" podStartSLOduration=2.29579583 podStartE2EDuration="20.241129447s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.197504438 +0000 UTC m=+1.815294180" lastFinishedPulling="2026-04-16 14:30:13.142838045 +0000 UTC m=+19.760627797" observedRunningTime="2026-04-16 14:30:14.240117001 +0000 UTC m=+20.857906764" watchObservedRunningTime="2026-04-16 14:30:14.241129447 +0000 UTC m=+20.858919211"
Apr 16 14:30:14.259538 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.259413 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v76f6" podStartSLOduration=2.319888445 podStartE2EDuration="20.259395579s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.208878635 +0000 UTC m=+1.826668381" lastFinishedPulling="2026-04-16 14:30:13.148385759 +0000 UTC m=+19.766175515" observedRunningTime="2026-04-16 14:30:14.259050283 +0000 UTC m=+20.876840052" watchObservedRunningTime="2026-04-16 14:30:14.259395579 +0000 UTC m=+20.877185346"
Apr 16 14:30:14.974181 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:14.973933 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:30:14.974307 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:14.974232 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:30:15.203314 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.203284 2579 generic.go:358] "Generic (PLEG): container finished" podID="88fbb2da-2de8-4d14-aa18-1817fb16e61c" containerID="a9b29183e9230ac44ca1967bbfa9ec3a8e18b7e9cc1794f1d7c157d7469a9283" exitCode=0
Apr 16 14:30:15.203868 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.203354 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerDied","Data":"a9b29183e9230ac44ca1967bbfa9ec3a8e18b7e9cc1794f1d7c157d7469a9283"}
Apr 16 14:30:15.204689 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.204670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4648f" event={"ID":"76137a5e-4deb-4e87-b7e5-d17bde0111e4","Type":"ContainerStarted","Data":"43c355aee9e2ca1a00a6de8bcc178c398147457cbf4a06567345e131bf0889c7"}
Apr 16 14:30:15.206210 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.206159 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t5h5z" event={"ID":"42343077-d6d1-4ad1-ba48-405f8545fbef","Type":"ContainerStarted","Data":"15aad2c9933e77976f9907f59232523326be65c96523400b615d4b436d7b38db"}
Apr 16 14:30:15.207664 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.207637 2579 generic.go:358] "Generic (PLEG): container finished" podID="e28845082dd3a225de8448aa7a81a8c9" containerID="7ca98d63dcc0ca792eac83a414e6d686979b089d2ead7c24728a3beeb7bbc9c1" exitCode=0
Apr 16 14:30:15.207743 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.207709 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" event={"ID":"e28845082dd3a225de8448aa7a81a8c9","Type":"ContainerDied","Data":"7ca98d63dcc0ca792eac83a414e6d686979b089d2ead7c24728a3beeb7bbc9c1"}
Apr 16 14:30:15.208899 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.208874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k7pvw" event={"ID":"2f2a6e36-a780-456c-a172-4207fbfa5df6","Type":"ContainerStarted","Data":"b49d18672b576e4dce33a87983a079adf2ec97f8652d14bc8f5ff05317b2cd85"}
Apr 16 14:30:15.210240 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.210218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vjfvw" event={"ID":"1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa","Type":"ContainerStarted","Data":"8cde26f97c26af9e291283ddef15635d3af284a476c332a04787885dd6d10175"}
Apr 16 14:30:15.211587 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.211558 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" event={"ID":"83ceccd4-351d-4f7b-a366-eff6c1e34ba1","Type":"ContainerStarted","Data":"ec02789cb65b46953bfb4fe8992f8a63df46c9431d5a84f19c2600527de1dbd7"}
Apr 16 14:30:15.243433 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.243371 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-t5h5z" podStartSLOduration=3.265072968 podStartE2EDuration="21.243355299s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.160960244 +0000 UTC m=+1.778749985" lastFinishedPulling="2026-04-16 14:30:13.139242571 +0000 UTC
m=+19.757032316" observedRunningTime="2026-04-16 14:30:15.242957289 +0000 UTC m=+21.860747053" watchObservedRunningTime="2026-04-16 14:30:15.243355299 +0000 UTC m=+21.861145063" Apr 16 14:30:15.252504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.252385 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:30:15.259254 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.259209 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vjfvw" podStartSLOduration=3.321638365 podStartE2EDuration="21.259196255s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.203383531 +0000 UTC m=+1.821173271" lastFinishedPulling="2026-04-16 14:30:13.140941406 +0000 UTC m=+19.758731161" observedRunningTime="2026-04-16 14:30:15.258999661 +0000 UTC m=+21.876789456" watchObservedRunningTime="2026-04-16 14:30:15.259196255 +0000 UTC m=+21.876986017" Apr 16 14:30:15.272763 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.272717 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4648f" podStartSLOduration=3.373684042 podStartE2EDuration="21.272706225s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.215086709 +0000 UTC m=+1.832876453" lastFinishedPulling="2026-04-16 14:30:13.114108888 +0000 UTC m=+19.731898636" observedRunningTime="2026-04-16 14:30:15.272309274 +0000 UTC m=+21.890099036" watchObservedRunningTime="2026-04-16 14:30:15.272706225 +0000 UTC m=+21.890495988" Apr 16 14:30:15.285812 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.285779 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-k7pvw" podStartSLOduration=3.39725186 podStartE2EDuration="21.285769825s" podCreationTimestamp="2026-04-16 
14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.22559152 +0000 UTC m=+1.843381261" lastFinishedPulling="2026-04-16 14:30:13.114109476 +0000 UTC m=+19.731899226" observedRunningTime="2026-04-16 14:30:15.285668502 +0000 UTC m=+21.903458265" watchObservedRunningTime="2026-04-16 14:30:15.285769825 +0000 UTC m=+21.903559587" Apr 16 14:30:15.916891 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.916788 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:30:15.252403773Z","UUID":"66311a73-1c8e-4430-9894-89739fd3ccdb","Handler":null,"Name":"","Endpoint":""} Apr 16 14:30:15.918633 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.918607 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 14:30:15.918782 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.918642 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:30:15.973605 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:15.973571 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:15.973771 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:15.973745 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a" Apr 16 14:30:16.216810 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:16.216785 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:30:16.217310 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:16.217281 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"9b87cb54cb7c7b5b9798732c44328730143f5bf9246997adf9882ce9beed8ccf"} Apr 16 14:30:16.219099 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:16.219060 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" event={"ID":"83ceccd4-351d-4f7b-a366-eff6c1e34ba1","Type":"ContainerStarted","Data":"db63d0cd0d161986b88e5b60a12252625fab94142f5672c44ffe081eb6ebfe28"} Apr 16 14:30:16.221129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:16.221082 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" event={"ID":"e28845082dd3a225de8448aa7a81a8c9","Type":"ContainerStarted","Data":"5336fe16d1c9d5f30cd2a532cba884446aaadb49d4a2cd383a25fee491d90d91"} Apr 16 14:30:16.236993 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:16.236953 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-144.ec2.internal" podStartSLOduration=22.236939888 podStartE2EDuration="22.236939888s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:16.236416849 +0000 UTC m=+22.854206612" 
watchObservedRunningTime="2026-04-16 14:30:16.236939888 +0000 UTC m=+22.854729651" Apr 16 14:30:16.973518 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:16.973481 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:16.973698 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:16.973621 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c" Apr 16 14:30:17.224937 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:17.224839 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" event={"ID":"83ceccd4-351d-4f7b-a366-eff6c1e34ba1","Type":"ContainerStarted","Data":"5587ac7b5198e7f36a3c7841b206cac067b7ebcbbf17bbb83a0909a06196e8ff"} Apr 16 14:30:17.246536 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:17.246486 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kdm8" podStartSLOduration=2.338241309 podStartE2EDuration="23.246470282s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.189012178 +0000 UTC m=+1.806801919" lastFinishedPulling="2026-04-16 14:30:16.097241132 +0000 UTC m=+22.715030892" observedRunningTime="2026-04-16 14:30:17.245855203 +0000 UTC m=+23.863644967" watchObservedRunningTime="2026-04-16 14:30:17.246470282 +0000 UTC m=+23.864260035" Apr 16 14:30:17.976167 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:17.976136 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:17.976324 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:17.976250 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a" Apr 16 14:30:18.831204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:18.831165 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-t5h5z" Apr 16 14:30:18.832166 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:18.832140 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-t5h5z" Apr 16 14:30:18.973874 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:18.973839 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:18.974076 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:18.973969 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c" Apr 16 14:30:19.234892 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:19.234720 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:30:19.235245 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:19.235205 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"112ee72954d42625787626d86a94c784f0f55470f53bf3000a63c98b9df27875"} Apr 16 14:30:19.973543 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:19.973509 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:19.974279 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:19.973633 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a" Apr 16 14:30:20.239004 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.238910 2579 generic.go:358] "Generic (PLEG): container finished" podID="88fbb2da-2de8-4d14-aa18-1817fb16e61c" containerID="02e26c91524304ef9d14e024fffd94ae3be7f75a4f6f9ea90f46d8df78fb6d8f" exitCode=0 Apr 16 14:30:20.239004 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.238986 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerDied","Data":"02e26c91524304ef9d14e024fffd94ae3be7f75a4f6f9ea90f46d8df78fb6d8f"} Apr 16 14:30:20.239370 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.239351 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:30:20.239434 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.239379 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:30:20.239570 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.239529 2579 scope.go:117] "RemoveContainer" containerID="33d5abd93b72e53dd4daa0abb721d44e462b7cbfd793e564ddbe07fb172de5e0" Apr 16 14:30:20.254404 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.254382 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:30:20.254588 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.254573 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:30:20.578175 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.578145 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-t5h5z" Apr 16 14:30:20.578399 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.578257 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:30:20.578780 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.578761 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-t5h5z" Apr 16 14:30:20.973843 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:20.973668 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:20.974327 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:20.973971 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c" Apr 16 14:30:21.152454 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.152423 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m57qr"] Apr 16 14:30:21.154888 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.154864 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l6qjc"] Apr 16 14:30:21.155016 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.154951 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:21.155081 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:21.155048 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a" Apr 16 14:30:21.244103 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.244011 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:30:21.244389 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.244366 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" event={"ID":"dc8ac89f-dee2-4e7c-b409-39b0900c673e","Type":"ContainerStarted","Data":"47160a1c1f6226ec149a5161e1676ad4f5cd059f4f445712334331e9762d529b"} Apr 16 14:30:21.244473 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.244455 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:30:21.249043 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.249000 2579 generic.go:358] "Generic (PLEG): container finished" podID="88fbb2da-2de8-4d14-aa18-1817fb16e61c" containerID="ab1879074371de5d8663a8ce3aed87b6c263b6fd54ca3178d029fadd12356a26" exitCode=0 Apr 16 14:30:21.249195 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.249052 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerDied","Data":"ab1879074371de5d8663a8ce3aed87b6c263b6fd54ca3178d029fadd12356a26"} Apr 16 14:30:21.249260 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.249244 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:21.249414 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:21.249393 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c" Apr 16 14:30:21.274113 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.274071 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" podStartSLOduration=8.935159362 podStartE2EDuration="27.274058653s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.231844478 +0000 UTC m=+1.849634218" lastFinishedPulling="2026-04-16 14:30:13.570743748 +0000 UTC m=+20.188533509" observedRunningTime="2026-04-16 14:30:21.272879315 +0000 UTC m=+27.890669078" watchObservedRunningTime="2026-04-16 14:30:21.274058653 +0000 UTC m=+27.891848412" Apr 16 14:30:21.518893 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:21.518806 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:30:22.253320 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:22.253287 2579 generic.go:358] "Generic (PLEG): container finished" podID="88fbb2da-2de8-4d14-aa18-1817fb16e61c" containerID="139920f88cc8e7a8e4665c3ae9f0d84d7c579cb95c84dd3ba8c371720d9fecee" exitCode=0 Apr 16 14:30:22.253773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:22.253369 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" 
event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerDied","Data":"139920f88cc8e7a8e4665c3ae9f0d84d7c579cb95c84dd3ba8c371720d9fecee"} Apr 16 14:30:22.973815 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:22.973701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:22.974088 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:22.973712 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:22.974088 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:22.973824 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c" Apr 16 14:30:22.974088 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:22.973933 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a" Apr 16 14:30:24.974229 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:24.974164 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:24.974705 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:24.974165 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:24.974705 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:24.974267 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c" Apr 16 14:30:24.974705 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:24.974367 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l6qjc" podUID="38b35ba8-4e6c-4198-bb10-ea4df7f8816a" Apr 16 14:30:26.151900 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.151872 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-144.ec2.internal" event="NodeReady" Apr 16 14:30:26.152312 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.152060 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:30:26.202506 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.202470 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4t4r6"] Apr 16 14:30:26.223469 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.223442 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-src8k"] Apr 16 14:30:26.223652 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.223634 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.225986 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.225962 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:30:26.226147 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.226008 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:30:26.226147 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.226065 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dqsrn\"" Apr 16 14:30:26.233694 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.233666 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4t4r6"] Apr 16 14:30:26.233812 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.233704 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-src8k"] Apr 16 14:30:26.233983 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.233966 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:26.236939 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.236920 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:30:26.237087 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.236961 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zpkvj\"" Apr 16 14:30:26.237087 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.236964 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:30:26.237087 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.236906 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:30:26.405335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.405252 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b44026e-170c-4488-824c-c757c82f68cd-tmp-dir\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.405335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.405293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc52s\" (UniqueName: \"kubernetes.io/projected/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-kube-api-access-dc52s\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:26.405555 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.405408 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/0b44026e-170c-4488-824c-c757c82f68cd-config-volume\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.405555 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.405473 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhszr\" (UniqueName: \"kubernetes.io/projected/0b44026e-170c-4488-824c-c757c82f68cd-kube-api-access-zhszr\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.405555 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.405507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:26.405555 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.405534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.506653 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.506617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b44026e-170c-4488-824c-c757c82f68cd-config-volume\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.506855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.506673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhszr\" 
(UniqueName: \"kubernetes.io/projected/0b44026e-170c-4488-824c-c757c82f68cd-kube-api-access-zhszr\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.506855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.506700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:26.506855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.506721 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.506855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.506740 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b44026e-170c-4488-824c-c757c82f68cd-tmp-dir\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.506855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.506767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dc52s\" (UniqueName: \"kubernetes.io/projected/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-kube-api-access-dc52s\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:26.507152 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.506898 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 
14:30:26.507152 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.506918 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:26.507152 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.506983 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. No retries permitted until 2026-04-16 14:30:27.006961074 +0000 UTC m=+33.624750821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:30:26.507152 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.507004 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:30:27.006994355 +0000 UTC m=+33.624784104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:30:26.507152 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.507144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b44026e-170c-4488-824c-c757c82f68cd-tmp-dir\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.517132 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.517076 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b44026e-170c-4488-824c-c757c82f68cd-config-volume\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.520686 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.520556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhszr\" (UniqueName: \"kubernetes.io/projected/0b44026e-170c-4488-824c-c757c82f68cd-kube-api-access-zhszr\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:26.520817 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.520614 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc52s\" (UniqueName: \"kubernetes.io/projected/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-kube-api-access-dc52s\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:26.607925 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.607890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:26.608138 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.608077 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:30:26.608190 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.608177 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:30:58.60815396 +0000 UTC m=+65.225943705 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:30:26.809833 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.809791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:26.810025 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.809986 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:30:26.810025 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.810013 2579 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:30:26.810025 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.810043 2579 projected.go:194] Error preparing data for projected volume kube-api-access-nk9qc for pod openshift-network-diagnostics/network-check-target-l6qjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:30:26.810199 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:26.810112 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc podName:38b35ba8-4e6c-4198-bb10-ea4df7f8816a nodeName:}" failed. No retries permitted until 2026-04-16 14:30:58.810092943 +0000 UTC m=+65.427882685 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nk9qc" (UniqueName: "kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc") pod "network-check-target-l6qjc" (UID: "38b35ba8-4e6c-4198-bb10-ea4df7f8816a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:30:26.974220 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.974186 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:26.974409 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.974186 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:26.977112 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.977087 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:30:26.977260 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.977119 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bl9dn\"" Apr 16 14:30:26.977260 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.977087 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:30:26.977260 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.977087 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:30:26.977955 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:26.977935 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8s5q\"" Apr 16 14:30:27.012624 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:27.012598 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:27.012778 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:27.012649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:27.012778 ip-10-0-140-144 kubenswrapper[2579]: 
E0416 14:30:27.012749 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:27.012778 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:27.012774 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:27.012937 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:27.012821 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. No retries permitted until 2026-04-16 14:30:28.01280114 +0000 UTC m=+34.630590889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:30:27.012937 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:27.012843 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:30:28.012832824 +0000 UTC m=+34.630622571 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:30:28.022263 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:28.022225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:28.022736 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:28.022339 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:28.022736 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:28.022387 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:28.022736 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:28.022445 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:28.022736 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:28.022470 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:30:30.022449949 +0000 UTC m=+36.640239714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:30:28.022736 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:28.022491 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. No retries permitted until 2026-04-16 14:30:30.022479013 +0000 UTC m=+36.640268754 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:30:29.269926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:29.269892 2579 generic.go:358] "Generic (PLEG): container finished" podID="88fbb2da-2de8-4d14-aa18-1817fb16e61c" containerID="e618a3ab76b4515ea304b85db0f589b1ebc4010210780695faa98805b99de1f0" exitCode=0 Apr 16 14:30:29.270402 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:29.269940 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerDied","Data":"e618a3ab76b4515ea304b85db0f589b1ebc4010210780695faa98805b99de1f0"} Apr 16 14:30:30.036325 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:30.036280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:30.036325 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:30.036326 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:30.036530 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:30.036423 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:30.036530 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:30.036436 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:30.036530 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:30.036493 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:30:34.03647418 +0000 UTC m=+40.654263938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:30:30.036530 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:30.036512 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. No retries permitted until 2026-04-16 14:30:34.036503485 +0000 UTC m=+40.654293227 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:30:30.274180 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:30.274150 2579 generic.go:358] "Generic (PLEG): container finished" podID="88fbb2da-2de8-4d14-aa18-1817fb16e61c" containerID="068898ace194d4c5b39549c51ad81a89cc90fb23d5b72f510734ef78d418c39f" exitCode=0 Apr 16 14:30:30.274695 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:30.274194 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerDied","Data":"068898ace194d4c5b39549c51ad81a89cc90fb23d5b72f510734ef78d418c39f"} Apr 16 14:30:31.278565 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:31.278532 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" event={"ID":"88fbb2da-2de8-4d14-aa18-1817fb16e61c","Type":"ContainerStarted","Data":"f9aa859d6a59ddcbcdc5eebb0063250a0cecae6e24789202b3604c25870ed371"} Apr 16 14:30:31.306155 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:31.306108 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7dhhl" podStartSLOduration=3.995913057 podStartE2EDuration="37.306093117s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:29:55.219204857 +0000 UTC m=+1.836994597" lastFinishedPulling="2026-04-16 14:30:28.529384912 +0000 UTC m=+35.147174657" observedRunningTime="2026-04-16 14:30:31.305379363 +0000 UTC m=+37.923169126" watchObservedRunningTime="2026-04-16 14:30:31.306093117 +0000 UTC m=+37.923882880" Apr 16 14:30:34.064447 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:34.064405 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:34.064447 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:34.064455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:34.064983 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:34.064570 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:34.064983 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:34.064579 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:34.064983 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:34.064633 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:30:42.064619244 +0000 UTC m=+48.682408985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:30:34.064983 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:34.064664 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. 
No retries permitted until 2026-04-16 14:30:42.064644782 +0000 UTC m=+48.682434536 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:30:42.120621 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:42.120578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:42.120621 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:42.120627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:42.121139 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:42.120746 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:42.121139 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:42.120826 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. No retries permitted until 2026-04-16 14:30:58.120810029 +0000 UTC m=+64.738599770 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:30:42.121139 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:42.120748 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:42.121139 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:42.120905 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:30:58.120893681 +0000 UTC m=+64.738683422 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:30:53.266655 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:53.266627 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vclsl" Apr 16 14:30:58.133861 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.133823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:30:58.133861 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.133865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: 
\"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:30:58.134322 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:58.133983 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:58.134322 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:58.133987 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:58.134322 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:58.134061 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:31:30.134023283 +0000 UTC m=+96.751813026 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:30:58.134322 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:58.134079 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. No retries permitted until 2026-04-16 14:31:30.134071549 +0000 UTC m=+96.751861291 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:30:58.636862 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.636821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:30:58.639405 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.639386 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:30:58.647896 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:58.647875 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:30:58.647957 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:30:58.647934 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:32:02.647918194 +0000 UTC m=+129.265707934 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : secret "metrics-daemon-secret" not found Apr 16 14:30:58.838378 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.838343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:58.841131 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.841112 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:30:58.851369 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.851352 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:30:58.861757 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:58.861735 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk9qc\" (UniqueName: \"kubernetes.io/projected/38b35ba8-4e6c-4198-bb10-ea4df7f8816a-kube-api-access-nk9qc\") pod \"network-check-target-l6qjc\" (UID: \"38b35ba8-4e6c-4198-bb10-ea4df7f8816a\") " pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:59.088814 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:59.088787 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8s5q\"" Apr 16 14:30:59.096820 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:59.096797 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:30:59.272800 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:59.272770 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l6qjc"] Apr 16 14:30:59.277696 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:30:59.277669 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b35ba8_4e6c_4198_bb10_ea4df7f8816a.slice/crio-4a52f6dc0cf42361c3da79e0f53a97cf570c00aec7ad5e3ac3ea9b96074e90e7 WatchSource:0}: Error finding container 4a52f6dc0cf42361c3da79e0f53a97cf570c00aec7ad5e3ac3ea9b96074e90e7: Status 404 returned error can't find the container with id 4a52f6dc0cf42361c3da79e0f53a97cf570c00aec7ad5e3ac3ea9b96074e90e7 Apr 16 14:30:59.326627 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:30:59.326596 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l6qjc" event={"ID":"38b35ba8-4e6c-4198-bb10-ea4df7f8816a","Type":"ContainerStarted","Data":"4a52f6dc0cf42361c3da79e0f53a97cf570c00aec7ad5e3ac3ea9b96074e90e7"} Apr 16 14:31:02.332902 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:02.332761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l6qjc" event={"ID":"38b35ba8-4e6c-4198-bb10-ea4df7f8816a","Type":"ContainerStarted","Data":"c0e8fbff3bc4904555a2d87d7e95be87487e6bbe921fb908089e392d71f95341"} Apr 16 14:31:02.332902 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:02.332879 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:31:02.351178 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:02.351131 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-l6qjc" 
podStartSLOduration=65.617440333 podStartE2EDuration="1m8.35111695s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:30:59.279513422 +0000 UTC m=+65.897303163" lastFinishedPulling="2026-04-16 14:31:02.013190026 +0000 UTC m=+68.630979780" observedRunningTime="2026-04-16 14:31:02.350605909 +0000 UTC m=+68.968395671" watchObservedRunningTime="2026-04-16 14:31:02.35111695 +0000 UTC m=+68.968906712" Apr 16 14:31:30.146361 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:30.146314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k" Apr 16 14:31:30.146361 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:30.146367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6" Apr 16 14:31:30.146895 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:30.146491 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:31:30.146895 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:30.146556 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls podName:0b44026e-170c-4488-824c-c757c82f68cd nodeName:}" failed. No retries permitted until 2026-04-16 14:32:34.146541809 +0000 UTC m=+160.764331550 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls") pod "dns-default-4t4r6" (UID: "0b44026e-170c-4488-824c-c757c82f68cd") : secret "dns-default-metrics-tls" not found Apr 16 14:31:30.146895 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:30.146493 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:31:30.146895 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:30.146682 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert podName:93aad6d1-f6fd-49fd-aeac-3661dd3118bf nodeName:}" failed. No retries permitted until 2026-04-16 14:32:34.146667634 +0000 UTC m=+160.764457393 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert") pod "ingress-canary-src8k" (UID: "93aad6d1-f6fd-49fd-aeac-3661dd3118bf") : secret "canary-serving-cert" not found Apr 16 14:31:33.336799 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:33.336767 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-l6qjc" Apr 16 14:31:46.968494 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.968450 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-579fc97f6b-mdg5r"] Apr 16 14:31:46.972721 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.972700 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:46.974732 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.974708 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:31:46.974831 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.974754 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:31:46.974831 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.974762 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:31:46.974941 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.974899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r52pv\"" Apr 16 14:31:46.980216 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.980199 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:31:46.980835 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:46.980755 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579fc97f6b-mdg5r"] Apr 16 14:31:47.064015 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.063983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-installation-pull-secrets\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.064015 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.064015 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-certificates\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.064234 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.064097 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.064234 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.064163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-bound-sa-token\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.064234 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.064182 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94qs\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-kube-api-access-c94qs\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.064234 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.064201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/954a0b17-ede2-4a5f-a717-fb8f76331dee-ca-trust-extracted\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.064234 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.064220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-trusted-ca\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.064397 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.064268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-image-registry-private-configuration\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.079644 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.079613 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx"] Apr 16 14:31:47.082451 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.082429 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-lr5dz"] Apr 16 14:31:47.082669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.082650 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" Apr 16 14:31:47.085178 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.085161 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.100432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.100414 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 14:31:47.100805 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.100789 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qwbdk\"" Apr 16 14:31:47.101374 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.101362 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 14:31:47.103726 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.103712 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:31:47.104274 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.104252 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:31:47.104416 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.104393 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 14:31:47.127472 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.127445 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-xcshx\"" Apr 16 14:31:47.127609 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.127483 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 14:31:47.128293 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.128270 2579 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx"] Apr 16 14:31:47.144827 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.144803 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 14:31:47.151964 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.151941 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-lr5dz"] Apr 16 14:31:47.165180 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/954a0b17-ede2-4a5f-a717-fb8f76331dee-ca-trust-extracted\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165188 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-trusted-ca\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fca51d46-f492-4bcf-8d66-7ecb32e54d54-trusted-ca\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.165280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165241 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-image-registry-private-configuration\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2r4q\" (UniqueName: \"kubernetes.io/projected/3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae-kube-api-access-h2r4q\") pod \"volume-data-source-validator-7d955d5dd4-gr6dx\" (UID: \"3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" Apr 16 14:31:47.165438 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165320 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vzmh\" (UniqueName: \"kubernetes.io/projected/fca51d46-f492-4bcf-8d66-7ecb32e54d54-kube-api-access-6vzmh\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.165438 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fca51d46-f492-4bcf-8d66-7ecb32e54d54-config\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.165522 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-installation-pull-secrets\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165522 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-certificates\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165522 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165478 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fca51d46-f492-4bcf-8d66-7ecb32e54d54-serving-cert\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.165666 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165666 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-bound-sa-token\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 
14:31:47.165666 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c94qs\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-kube-api-access-c94qs\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165666 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.165582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/954a0b17-ede2-4a5f-a717-fb8f76331dee-ca-trust-extracted\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.165880 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:47.165728 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:31:47.165880 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:47.165745 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579fc97f6b-mdg5r: secret "image-registry-tls" not found Apr 16 14:31:47.165880 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:47.165806 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls podName:954a0b17-ede2-4a5f-a717-fb8f76331dee nodeName:}" failed. No retries permitted until 2026-04-16 14:31:47.665786704 +0000 UTC m=+114.283576454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls") pod "image-registry-579fc97f6b-mdg5r" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee") : secret "image-registry-tls" not found Apr 16 14:31:47.166601 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.166579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-certificates\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.166735 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.166718 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-trusted-ca\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.168339 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.168320 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-image-registry-private-configuration\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.168457 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.168439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-installation-pull-secrets\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " 
pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.186424 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.186404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94qs\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-kube-api-access-c94qs\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.192773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.192752 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-bound-sa-token\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.266669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.266592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fca51d46-f492-4bcf-8d66-7ecb32e54d54-serving-cert\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.266669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.266641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fca51d46-f492-4bcf-8d66-7ecb32e54d54-trusted-ca\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.266883 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.266672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2r4q\" (UniqueName: 
\"kubernetes.io/projected/3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae-kube-api-access-h2r4q\") pod \"volume-data-source-validator-7d955d5dd4-gr6dx\" (UID: \"3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" Apr 16 14:31:47.266883 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.266700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vzmh\" (UniqueName: \"kubernetes.io/projected/fca51d46-f492-4bcf-8d66-7ecb32e54d54-kube-api-access-6vzmh\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.266883 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.266730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fca51d46-f492-4bcf-8d66-7ecb32e54d54-config\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.267365 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.267342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fca51d46-f492-4bcf-8d66-7ecb32e54d54-config\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.267657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.267619 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fca51d46-f492-4bcf-8d66-7ecb32e54d54-trusted-ca\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.268870 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.268847 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fca51d46-f492-4bcf-8d66-7ecb32e54d54-serving-cert\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.288270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.288239 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2r4q\" (UniqueName: \"kubernetes.io/projected/3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae-kube-api-access-h2r4q\") pod \"volume-data-source-validator-7d955d5dd4-gr6dx\" (UID: \"3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" Apr 16 14:31:47.293855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.293827 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vzmh\" (UniqueName: \"kubernetes.io/projected/fca51d46-f492-4bcf-8d66-7ecb32e54d54-kube-api-access-6vzmh\") pod \"console-operator-d87b8d5fc-lr5dz\" (UID: \"fca51d46-f492-4bcf-8d66-7ecb32e54d54\") " pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.392002 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.391975 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" Apr 16 14:31:47.396676 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.396659 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:47.512075 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.512023 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx"] Apr 16 14:31:47.515396 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:31:47.515362 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e29e4f2_44c1_4a7f_b243_c4e1d79a60ae.slice/crio-ac4860bc5f3f9b4e3a30bfc72023012fb0b6db7a70ad1c48021e886c25f3046a WatchSource:0}: Error finding container ac4860bc5f3f9b4e3a30bfc72023012fb0b6db7a70ad1c48021e886c25f3046a: Status 404 returned error can't find the container with id ac4860bc5f3f9b4e3a30bfc72023012fb0b6db7a70ad1c48021e886c25f3046a Apr 16 14:31:47.527363 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.527340 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-lr5dz"] Apr 16 14:31:47.530699 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:31:47.530674 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca51d46_f492_4bcf_8d66_7ecb32e54d54.slice/crio-5b680018834493de8f53c556d92e21f9a5fb05c0e078f7ffccfa3e0cc66fa8ad WatchSource:0}: Error finding container 5b680018834493de8f53c556d92e21f9a5fb05c0e078f7ffccfa3e0cc66fa8ad: Status 404 returned error can't find the container with id 5b680018834493de8f53c556d92e21f9a5fb05c0e078f7ffccfa3e0cc66fa8ad Apr 16 14:31:47.670058 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:47.669992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " 
pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:47.670204 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:47.670159 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:31:47.670204 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:47.670178 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579fc97f6b-mdg5r: secret "image-registry-tls" not found Apr 16 14:31:47.670298 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:47.670237 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls podName:954a0b17-ede2-4a5f-a717-fb8f76331dee nodeName:}" failed. No retries permitted until 2026-04-16 14:31:48.670219298 +0000 UTC m=+115.288009058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls") pod "image-registry-579fc97f6b-mdg5r" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee") : secret "image-registry-tls" not found Apr 16 14:31:48.426070 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:48.425946 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" event={"ID":"fca51d46-f492-4bcf-8d66-7ecb32e54d54","Type":"ContainerStarted","Data":"5b680018834493de8f53c556d92e21f9a5fb05c0e078f7ffccfa3e0cc66fa8ad"} Apr 16 14:31:48.427151 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:48.427103 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" event={"ID":"3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae","Type":"ContainerStarted","Data":"ac4860bc5f3f9b4e3a30bfc72023012fb0b6db7a70ad1c48021e886c25f3046a"} Apr 16 14:31:48.676681 ip-10-0-140-144 kubenswrapper[2579]: I0416 
14:31:48.676605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:48.676876 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:48.676763 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:31:48.676876 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:48.676790 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579fc97f6b-mdg5r: secret "image-registry-tls" not found Apr 16 14:31:48.676876 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:48.676853 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls podName:954a0b17-ede2-4a5f-a717-fb8f76331dee nodeName:}" failed. No retries permitted until 2026-04-16 14:31:50.676837165 +0000 UTC m=+117.294626923 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls") pod "image-registry-579fc97f6b-mdg5r" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee") : secret "image-registry-tls" not found Apr 16 14:31:49.429326 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:49.429299 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" event={"ID":"3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae","Type":"ContainerStarted","Data":"e856e8161b62a98ea4d95c5602850cf55de20bf1ff8b86e9007f3689f47d39f7"} Apr 16 14:31:49.446171 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:49.446130 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-gr6dx" podStartSLOduration=1.138277433 podStartE2EDuration="2.446114872s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="2026-04-16 14:31:47.517130965 +0000 UTC m=+114.134920710" lastFinishedPulling="2026-04-16 14:31:48.824968408 +0000 UTC m=+115.442758149" observedRunningTime="2026-04-16 14:31:49.445113707 +0000 UTC m=+116.062903470" watchObservedRunningTime="2026-04-16 14:31:49.446114872 +0000 UTC m=+116.063904635" Apr 16 14:31:50.435319 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.435282 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/0.log" Apr 16 14:31:50.435804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.435329 2579 generic.go:358] "Generic (PLEG): container finished" podID="fca51d46-f492-4bcf-8d66-7ecb32e54d54" containerID="0bdf7e6ff78dce80ec43a243c11845fc0619697f0191f7c323a8d3bed5790715" exitCode=255 Apr 16 14:31:50.435804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.435419 2579 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" event={"ID":"fca51d46-f492-4bcf-8d66-7ecb32e54d54","Type":"ContainerDied","Data":"0bdf7e6ff78dce80ec43a243c11845fc0619697f0191f7c323a8d3bed5790715"} Apr 16 14:31:50.435804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.435713 2579 scope.go:117] "RemoveContainer" containerID="0bdf7e6ff78dce80ec43a243c11845fc0619697f0191f7c323a8d3bed5790715" Apr 16 14:31:50.471944 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.471912 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm"] Apr 16 14:31:50.476065 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.476046 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" Apr 16 14:31:50.478091 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.478071 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-lkwxz\"" Apr 16 14:31:50.482906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.482886 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm"] Apr 16 14:31:50.593099 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.593071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhbt4\" (UniqueName: \"kubernetes.io/projected/c9c1ca3a-17c3-42bd-b377-8de2ae3b265f-kube-api-access-fhbt4\") pod \"network-check-source-7b678d77c7-q47lm\" (UID: \"c9c1ca3a-17c3-42bd-b377-8de2ae3b265f\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" Apr 16 14:31:50.694543 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.694435 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:50.694543 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.694527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhbt4\" (UniqueName: \"kubernetes.io/projected/c9c1ca3a-17c3-42bd-b377-8de2ae3b265f-kube-api-access-fhbt4\") pod \"network-check-source-7b678d77c7-q47lm\" (UID: \"c9c1ca3a-17c3-42bd-b377-8de2ae3b265f\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" Apr 16 14:31:50.694750 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:50.694584 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:31:50.694750 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:50.694604 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579fc97f6b-mdg5r: secret "image-registry-tls" not found Apr 16 14:31:50.694750 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:50.694657 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls podName:954a0b17-ede2-4a5f-a717-fb8f76331dee nodeName:}" failed. No retries permitted until 2026-04-16 14:31:54.694642067 +0000 UTC m=+121.312431809 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls") pod "image-registry-579fc97f6b-mdg5r" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee") : secret "image-registry-tls" not found Apr 16 14:31:50.703412 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.703380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhbt4\" (UniqueName: \"kubernetes.io/projected/c9c1ca3a-17c3-42bd-b377-8de2ae3b265f-kube-api-access-fhbt4\") pod \"network-check-source-7b678d77c7-q47lm\" (UID: \"c9c1ca3a-17c3-42bd-b377-8de2ae3b265f\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" Apr 16 14:31:50.801566 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.801512 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" Apr 16 14:31:50.914540 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:50.914503 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm"] Apr 16 14:31:50.917420 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:31:50.917395 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c1ca3a_17c3_42bd_b377_8de2ae3b265f.slice/crio-c79fdf0c9f6d47f6e5031c5cc2c25114a17f3c1a5055bfeebd9ac119332a69f8 WatchSource:0}: Error finding container c79fdf0c9f6d47f6e5031c5cc2c25114a17f3c1a5055bfeebd9ac119332a69f8: Status 404 returned error can't find the container with id c79fdf0c9f6d47f6e5031c5cc2c25114a17f3c1a5055bfeebd9ac119332a69f8 Apr 16 14:31:51.439304 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.439263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" 
event={"ID":"c9c1ca3a-17c3-42bd-b377-8de2ae3b265f","Type":"ContainerStarted","Data":"ebbb0a5e090df0cccbfc16e32df5e3f9384e9c5730e7a2552f0fdc6666afc319"} Apr 16 14:31:51.439304 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.439309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" event={"ID":"c9c1ca3a-17c3-42bd-b377-8de2ae3b265f","Type":"ContainerStarted","Data":"c79fdf0c9f6d47f6e5031c5cc2c25114a17f3c1a5055bfeebd9ac119332a69f8"} Apr 16 14:31:51.440660 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.440637 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:31:51.441016 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.440999 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/0.log" Apr 16 14:31:51.441085 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.441055 2579 generic.go:358] "Generic (PLEG): container finished" podID="fca51d46-f492-4bcf-8d66-7ecb32e54d54" containerID="332fb43c0bc3b792dcd9253e86ad4d5a24ebb97ec28b374856c82e0507afbd7d" exitCode=255 Apr 16 14:31:51.441121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.441099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" event={"ID":"fca51d46-f492-4bcf-8d66-7ecb32e54d54","Type":"ContainerDied","Data":"332fb43c0bc3b792dcd9253e86ad4d5a24ebb97ec28b374856c82e0507afbd7d"} Apr 16 14:31:51.441154 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.441126 2579 scope.go:117] "RemoveContainer" containerID="0bdf7e6ff78dce80ec43a243c11845fc0619697f0191f7c323a8d3bed5790715" Apr 16 14:31:51.441461 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.441437 2579 scope.go:117] "RemoveContainer" 
containerID="332fb43c0bc3b792dcd9253e86ad4d5a24ebb97ec28b374856c82e0507afbd7d" Apr 16 14:31:51.441672 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:51.441651 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-lr5dz_openshift-console-operator(fca51d46-f492-4bcf-8d66-7ecb32e54d54)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" podUID="fca51d46-f492-4bcf-8d66-7ecb32e54d54" Apr 16 14:31:51.456630 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.456593 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-q47lm" podStartSLOduration=1.456579204 podStartE2EDuration="1.456579204s" podCreationTimestamp="2026-04-16 14:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:31:51.455385414 +0000 UTC m=+118.073175180" watchObservedRunningTime="2026-04-16 14:31:51.456579204 +0000 UTC m=+118.074369003" Apr 16 14:31:51.771411 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.771376 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn"] Apr 16 14:31:51.775751 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.775734 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" Apr 16 14:31:51.777657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.777630 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 14:31:51.777780 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.777756 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 14:31:51.778059 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.778021 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qhgm2\"" Apr 16 14:31:51.783507 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.783481 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn"] Apr 16 14:31:51.904872 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:51.904827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9nkl\" (UniqueName: \"kubernetes.io/projected/b73c7eb9-b024-4371-b219-6f114a27050f-kube-api-access-z9nkl\") pod \"migrator-64d4d94569-ltfsn\" (UID: \"b73c7eb9-b024-4371-b219-6f114a27050f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" Apr 16 14:31:52.005908 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:52.005873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9nkl\" (UniqueName: \"kubernetes.io/projected/b73c7eb9-b024-4371-b219-6f114a27050f-kube-api-access-z9nkl\") pod \"migrator-64d4d94569-ltfsn\" (UID: \"b73c7eb9-b024-4371-b219-6f114a27050f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" Apr 16 14:31:52.027139 ip-10-0-140-144 
kubenswrapper[2579]: I0416 14:31:52.027067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9nkl\" (UniqueName: \"kubernetes.io/projected/b73c7eb9-b024-4371-b219-6f114a27050f-kube-api-access-z9nkl\") pod \"migrator-64d4d94569-ltfsn\" (UID: \"b73c7eb9-b024-4371-b219-6f114a27050f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" Apr 16 14:31:52.085303 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:52.085268 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" Apr 16 14:31:52.204361 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:52.204328 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn"] Apr 16 14:31:52.208337 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:31:52.208306 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb73c7eb9_b024_4371_b219_6f114a27050f.slice/crio-282c40cc15158ea973d405385e28481133310d4ed28fc2906ada7a69cc1b7314 WatchSource:0}: Error finding container 282c40cc15158ea973d405385e28481133310d4ed28fc2906ada7a69cc1b7314: Status 404 returned error can't find the container with id 282c40cc15158ea973d405385e28481133310d4ed28fc2906ada7a69cc1b7314 Apr 16 14:31:52.443780 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:52.443702 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" event={"ID":"b73c7eb9-b024-4371-b219-6f114a27050f","Type":"ContainerStarted","Data":"282c40cc15158ea973d405385e28481133310d4ed28fc2906ada7a69cc1b7314"} Apr 16 14:31:52.445065 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:52.445044 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:31:52.445553 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:52.445534 2579 scope.go:117] "RemoveContainer" containerID="332fb43c0bc3b792dcd9253e86ad4d5a24ebb97ec28b374856c82e0507afbd7d" Apr 16 14:31:52.445740 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:52.445719 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-lr5dz_openshift-console-operator(fca51d46-f492-4bcf-8d66-7ecb32e54d54)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" podUID="fca51d46-f492-4bcf-8d66-7ecb32e54d54" Apr 16 14:31:53.448926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.448831 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" event={"ID":"b73c7eb9-b024-4371-b219-6f114a27050f","Type":"ContainerStarted","Data":"ab51d8e92a084bb507e568b79401fff73a5a9cee0d12b09960117f9a3d23c415"} Apr 16 14:31:53.448926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.448875 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" event={"ID":"b73c7eb9-b024-4371-b219-6f114a27050f","Type":"ContainerStarted","Data":"de17227d15df87e58504c3e62eabf7e54b59104af2a117b61894234ed810207f"} Apr 16 14:31:53.466462 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.466414 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-ltfsn" podStartSLOduration=1.48588972 podStartE2EDuration="2.466400124s" podCreationTimestamp="2026-04-16 14:31:51 +0000 UTC" firstStartedPulling="2026-04-16 14:31:52.21019899 +0000 UTC m=+118.827988737" 
lastFinishedPulling="2026-04-16 14:31:53.190709395 +0000 UTC m=+119.808499141" observedRunningTime="2026-04-16 14:31:53.465393519 +0000 UTC m=+120.083183281" watchObservedRunningTime="2026-04-16 14:31:53.466400124 +0000 UTC m=+120.084189888" Apr 16 14:31:53.592969 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.592935 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-sbrlj"] Apr 16 14:31:53.595953 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.595937 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.598181 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.598156 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 14:31:53.598334 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.598156 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 14:31:53.598334 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.598214 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 14:31:53.598334 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.598221 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 14:31:53.598444 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.598352 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jx5g4\"" Apr 16 14:31:53.603533 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.603513 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-sbrlj"] Apr 16 14:31:53.630658 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.630630 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vjfvw_1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa/dns-node-resolver/0.log" Apr 16 14:31:53.720356 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.720278 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/82d654b9-7864-4c68-9543-b09b8243ac64-signing-cabundle\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.720356 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.720338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/82d654b9-7864-4c68-9543-b09b8243ac64-signing-key\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.720525 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.720392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789rk\" (UniqueName: \"kubernetes.io/projected/82d654b9-7864-4c68-9543-b09b8243ac64-kube-api-access-789rk\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.821366 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.821337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/82d654b9-7864-4c68-9543-b09b8243ac64-signing-cabundle\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.821441 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.821397 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/82d654b9-7864-4c68-9543-b09b8243ac64-signing-key\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.821441 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.821428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-789rk\" (UniqueName: \"kubernetes.io/projected/82d654b9-7864-4c68-9543-b09b8243ac64-kube-api-access-789rk\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.823813 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.823795 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 14:31:53.823855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.823800 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 14:31:53.829975 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.829959 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 14:31:53.832536 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.832516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/82d654b9-7864-4c68-9543-b09b8243ac64-signing-cabundle\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.834278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.834257 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/82d654b9-7864-4c68-9543-b09b8243ac64-signing-key\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.840608 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.840591 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 14:31:53.850739 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.850715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-789rk\" (UniqueName: \"kubernetes.io/projected/82d654b9-7864-4c68-9543-b09b8243ac64-kube-api-access-789rk\") pod \"service-ca-bfc587fb7-sbrlj\" (UID: \"82d654b9-7864-4c68-9543-b09b8243ac64\") " pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:53.907355 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.907334 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jx5g4\"" Apr 16 14:31:53.914910 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:53.914891 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" Apr 16 14:31:54.035750 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:54.035719 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-sbrlj"] Apr 16 14:31:54.042221 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:31:54.039264 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d654b9_7864_4c68_9543_b09b8243ac64.slice/crio-5271f2b2ef7d190a62fa2b496bfd6ebf9c862a2954b6cf8424788ba45162f05e WatchSource:0}: Error finding container 5271f2b2ef7d190a62fa2b496bfd6ebf9c862a2954b6cf8424788ba45162f05e: Status 404 returned error can't find the container with id 5271f2b2ef7d190a62fa2b496bfd6ebf9c862a2954b6cf8424788ba45162f05e Apr 16 14:31:54.452501 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:54.452460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" event={"ID":"82d654b9-7864-4c68-9543-b09b8243ac64","Type":"ContainerStarted","Data":"5271f2b2ef7d190a62fa2b496bfd6ebf9c862a2954b6cf8424788ba45162f05e"} Apr 16 14:31:54.631514 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:54.631485 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4648f_76137a5e-4deb-4e87-b7e5-d17bde0111e4/node-ca/0.log" Apr 16 14:31:54.729732 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:54.729634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:31:54.729896 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:54.729787 2579 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:31:54.729896 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:54.729808 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579fc97f6b-mdg5r: secret "image-registry-tls" not found Apr 16 14:31:54.729994 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:54.729900 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls podName:954a0b17-ede2-4a5f-a717-fb8f76331dee nodeName:}" failed. No retries permitted until 2026-04-16 14:32:02.729881399 +0000 UTC m=+129.347671154 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls") pod "image-registry-579fc97f6b-mdg5r" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee") : secret "image-registry-tls" not found Apr 16 14:31:56.459307 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:56.459264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" event={"ID":"82d654b9-7864-4c68-9543-b09b8243ac64","Type":"ContainerStarted","Data":"746a0b76b6dd011bc88cbf7541d1dc1832c63383986772681893293ce103bfa5"} Apr 16 14:31:56.476348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:56.476298 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-sbrlj" podStartSLOduration=1.763329353 podStartE2EDuration="3.4762843s" podCreationTimestamp="2026-04-16 14:31:53 +0000 UTC" firstStartedPulling="2026-04-16 14:31:54.041274514 +0000 UTC m=+120.659064259" lastFinishedPulling="2026-04-16 14:31:55.754229465 +0000 UTC m=+122.372019206" observedRunningTime="2026-04-16 14:31:56.475707696 +0000 UTC m=+123.093497462" watchObservedRunningTime="2026-04-16 14:31:56.4762843 +0000 UTC 
m=+123.094074385" Apr 16 14:31:57.396999 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:57.396955 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:57.396999 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:57.397003 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" Apr 16 14:31:57.397494 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:31:57.397481 2579 scope.go:117] "RemoveContainer" containerID="332fb43c0bc3b792dcd9253e86ad4d5a24ebb97ec28b374856c82e0507afbd7d" Apr 16 14:31:57.397705 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:31:57.397686 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-lr5dz_openshift-console-operator(fca51d46-f492-4bcf-8d66-7ecb32e54d54)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" podUID="fca51d46-f492-4bcf-8d66-7ecb32e54d54" Apr 16 14:32:02.696011 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:02.695969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:32:02.696447 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:02.696133 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:32:02.696447 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:02.696199 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs 
podName:20614211-2bef-41dc-aad8-94242eb8364c nodeName:}" failed. No retries permitted until 2026-04-16 14:34:04.696184434 +0000 UTC m=+251.313974174 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs") pod "network-metrics-daemon-m57qr" (UID: "20614211-2bef-41dc-aad8-94242eb8364c") : secret "metrics-daemon-secret" not found Apr 16 14:32:02.796518 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:02.796469 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:32:02.798880 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:02.798844 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"image-registry-579fc97f6b-mdg5r\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") " pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:32:02.883565 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:02.883536 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r52pv\"" Apr 16 14:32:02.892345 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:02.892311 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:32:03.015213 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:03.015184 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579fc97f6b-mdg5r"] Apr 16 14:32:03.018650 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:03.018619 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod954a0b17_ede2_4a5f_a717_fb8f76331dee.slice/crio-306e07d9a258a7bfafadfd7ee25b167039d310bdb1e8c2a9190c46cb5c922f75 WatchSource:0}: Error finding container 306e07d9a258a7bfafadfd7ee25b167039d310bdb1e8c2a9190c46cb5c922f75: Status 404 returned error can't find the container with id 306e07d9a258a7bfafadfd7ee25b167039d310bdb1e8c2a9190c46cb5c922f75 Apr 16 14:32:03.475713 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:03.475679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" event={"ID":"954a0b17-ede2-4a5f-a717-fb8f76331dee","Type":"ContainerStarted","Data":"ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50"} Apr 16 14:32:03.475713 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:03.475714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" event={"ID":"954a0b17-ede2-4a5f-a717-fb8f76331dee","Type":"ContainerStarted","Data":"306e07d9a258a7bfafadfd7ee25b167039d310bdb1e8c2a9190c46cb5c922f75"} Apr 16 14:32:03.475918 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:03.475791 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" Apr 16 14:32:03.497390 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:03.497339 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" 
podStartSLOduration=17.49732147 podStartE2EDuration="17.49732147s" podCreationTimestamp="2026-04-16 14:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:32:03.496670417 +0000 UTC m=+130.114460181" watchObservedRunningTime="2026-04-16 14:32:03.49732147 +0000 UTC m=+130.115111237"
Apr 16 14:32:09.976795 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:09.976764 2579 scope.go:117] "RemoveContainer" containerID="332fb43c0bc3b792dcd9253e86ad4d5a24ebb97ec28b374856c82e0507afbd7d"
Apr 16 14:32:10.495747 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:10.495719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log"
Apr 16 14:32:10.495928 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:10.495782 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" event={"ID":"fca51d46-f492-4bcf-8d66-7ecb32e54d54","Type":"ContainerStarted","Data":"e10f59424a11a7f0a45e011ce1860c11b879e251695367f867574d9b061e5631"}
Apr 16 14:32:10.496082 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:10.496061 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz"
Apr 16 14:32:10.516298 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:10.516238 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz" podStartSLOduration=21.622940537 podStartE2EDuration="23.516220459s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="2026-04-16 14:31:47.532135087 +0000 UTC m=+114.149924828" lastFinishedPulling="2026-04-16 14:31:49.425415007 +0000 UTC m=+116.043204750" observedRunningTime="2026-04-16 14:32:10.514270503 +0000 UTC m=+137.132060268" watchObservedRunningTime="2026-04-16 14:32:10.516220459 +0000 UTC m=+137.134010223"
Apr 16 14:32:10.749097 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:10.749001 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-lr5dz"
Apr 16 14:32:15.455529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.455493 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"]
Apr 16 14:32:15.458444 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.458428 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.460585 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.460560 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 14:32:15.460692 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.460649 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zmxm6\""
Apr 16 14:32:15.461382 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.461362 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 14:32:15.471843 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.471822 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"]
Apr 16 14:32:15.484892 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.484868 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-579fc97f6b-mdg5r"]
Apr 16 14:32:15.484892 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.484880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcf5dc26-6a7b-4d47-bd94-a76c16744638-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-mxz6w\" (UID: \"bcf5dc26-6a7b-4d47-bd94-a76c16744638\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.485096 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.484930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bcf5dc26-6a7b-4d47-bd94-a76c16744638-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-mxz6w\" (UID: \"bcf5dc26-6a7b-4d47-bd94-a76c16744638\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.584428 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.584387 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6w8wv"]
Apr 16 14:32:15.586351 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.586319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcf5dc26-6a7b-4d47-bd94-a76c16744638-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-mxz6w\" (UID: \"bcf5dc26-6a7b-4d47-bd94-a76c16744638\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.586484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.586428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bcf5dc26-6a7b-4d47-bd94-a76c16744638-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-mxz6w\" (UID: \"bcf5dc26-6a7b-4d47-bd94-a76c16744638\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.586969 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.586945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcf5dc26-6a7b-4d47-bd94-a76c16744638-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-mxz6w\" (UID: \"bcf5dc26-6a7b-4d47-bd94-a76c16744638\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.587967 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.587951 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.588967 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.588946 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bcf5dc26-6a7b-4d47-bd94-a76c16744638-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-mxz6w\" (UID: \"bcf5dc26-6a7b-4d47-bd94-a76c16744638\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.594065 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.594022 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 14:32:15.594195 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.594022 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 14:32:15.595188 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.595167 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:32:15.595463 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.595446 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k65jf\""
Apr 16 14:32:15.595601 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.595584 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:32:15.596598 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.596579 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-zvqf4"]
Apr 16 14:32:15.599563 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.599531 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-zvqf4"
Apr 16 14:32:15.602718 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.602641 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 14:32:15.602718 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.602670 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 14:32:15.603148 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.603126 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-s5glw\""
Apr 16 14:32:15.604359 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.604332 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6w8wv"]
Apr 16 14:32:15.616279 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.616257 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-zvqf4"]
Apr 16 14:32:15.687287 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.687246 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eadb0665-d202-43bc-b37c-41f49b666d83-data-volume\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.687470 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.687334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eadb0665-d202-43bc-b37c-41f49b666d83-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.687470 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.687368 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eadb0665-d202-43bc-b37c-41f49b666d83-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.687470 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.687387 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eadb0665-d202-43bc-b37c-41f49b666d83-crio-socket\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.687470 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.687457 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2jcj\" (UniqueName: \"kubernetes.io/projected/c405329f-15f2-4fd4-8b05-4f3e19783a99-kube-api-access-b2jcj\") pod \"downloads-586b57c7b4-zvqf4\" (UID: \"c405329f-15f2-4fd4-8b05-4f3e19783a99\") " pod="openshift-console/downloads-586b57c7b4-zvqf4"
Apr 16 14:32:15.687669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.687476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g84r\" (UniqueName: \"kubernetes.io/projected/eadb0665-d202-43bc-b37c-41f49b666d83-kube-api-access-5g84r\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.766327 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.766244 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"
Apr 16 14:32:15.787865 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.787834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eadb0665-d202-43bc-b37c-41f49b666d83-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.787992 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.787887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eadb0665-d202-43bc-b37c-41f49b666d83-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.787992 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.787917 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eadb0665-d202-43bc-b37c-41f49b666d83-crio-socket\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.787992 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.787956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2jcj\" (UniqueName: \"kubernetes.io/projected/c405329f-15f2-4fd4-8b05-4f3e19783a99-kube-api-access-b2jcj\") pod \"downloads-586b57c7b4-zvqf4\" (UID: \"c405329f-15f2-4fd4-8b05-4f3e19783a99\") " pod="openshift-console/downloads-586b57c7b4-zvqf4"
Apr 16 14:32:15.787992 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.787987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g84r\" (UniqueName: \"kubernetes.io/projected/eadb0665-d202-43bc-b37c-41f49b666d83-kube-api-access-5g84r\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.788227 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.788016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eadb0665-d202-43bc-b37c-41f49b666d83-data-volume\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.788227 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.788079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eadb0665-d202-43bc-b37c-41f49b666d83-crio-socket\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.788441 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.788420 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eadb0665-d202-43bc-b37c-41f49b666d83-data-volume\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.788530 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.788428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eadb0665-d202-43bc-b37c-41f49b666d83-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.790363 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.790338 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eadb0665-d202-43bc-b37c-41f49b666d83-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.797043 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.797013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g84r\" (UniqueName: \"kubernetes.io/projected/eadb0665-d202-43bc-b37c-41f49b666d83-kube-api-access-5g84r\") pod \"insights-runtime-extractor-6w8wv\" (UID: \"eadb0665-d202-43bc-b37c-41f49b666d83\") " pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.800575 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.800551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2jcj\" (UniqueName: \"kubernetes.io/projected/c405329f-15f2-4fd4-8b05-4f3e19783a99-kube-api-access-b2jcj\") pod \"downloads-586b57c7b4-zvqf4\" (UID: \"c405329f-15f2-4fd4-8b05-4f3e19783a99\") " pod="openshift-console/downloads-586b57c7b4-zvqf4"
Apr 16 14:32:15.891961 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.891927 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w"]
Apr 16 14:32:15.895174 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:15.895142 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf5dc26_6a7b_4d47_bd94_a76c16744638.slice/crio-4bdb8bfea45765de5772f932d29aba889393073569cf6a9f4f91962de5d32a93 WatchSource:0}: Error finding container 4bdb8bfea45765de5772f932d29aba889393073569cf6a9f4f91962de5d32a93: Status 404 returned error can't find the container with id 4bdb8bfea45765de5772f932d29aba889393073569cf6a9f4f91962de5d32a93
Apr 16 14:32:15.904043 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.904015 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6w8wv"
Apr 16 14:32:15.912874 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:15.912851 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-zvqf4"
Apr 16 14:32:16.032822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:16.032787 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6w8wv"]
Apr 16 14:32:16.035835 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:16.035807 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeadb0665_d202_43bc_b37c_41f49b666d83.slice/crio-5b7f0e83cde0de1b008e258fd87c416aa13fefb9caa72ad816564bd2d8576ae2 WatchSource:0}: Error finding container 5b7f0e83cde0de1b008e258fd87c416aa13fefb9caa72ad816564bd2d8576ae2: Status 404 returned error can't find the container with id 5b7f0e83cde0de1b008e258fd87c416aa13fefb9caa72ad816564bd2d8576ae2
Apr 16 14:32:16.050923 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:16.050898 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-zvqf4"]
Apr 16 14:32:16.054890 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:16.054866 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc405329f_15f2_4fd4_8b05_4f3e19783a99.slice/crio-25f191375fc9bbab29afdd236798f1c92167dec91d74045f477e14ea6eac4bbb WatchSource:0}: Error finding container 25f191375fc9bbab29afdd236798f1c92167dec91d74045f477e14ea6eac4bbb: Status 404 returned error can't find the container with id 25f191375fc9bbab29afdd236798f1c92167dec91d74045f477e14ea6eac4bbb
Apr 16 14:32:16.513583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:16.513485 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-zvqf4" event={"ID":"c405329f-15f2-4fd4-8b05-4f3e19783a99","Type":"ContainerStarted","Data":"25f191375fc9bbab29afdd236798f1c92167dec91d74045f477e14ea6eac4bbb"}
Apr 16 14:32:16.514605 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:16.514575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w" event={"ID":"bcf5dc26-6a7b-4d47-bd94-a76c16744638","Type":"ContainerStarted","Data":"4bdb8bfea45765de5772f932d29aba889393073569cf6a9f4f91962de5d32a93"}
Apr 16 14:32:16.515869 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:16.515847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w8wv" event={"ID":"eadb0665-d202-43bc-b37c-41f49b666d83","Type":"ContainerStarted","Data":"949f93df54c3fad9ac23a059f4c02c9b5388b5df7f18b02cfac86465bbb64681"}
Apr 16 14:32:16.515951 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:16.515874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w8wv" event={"ID":"eadb0665-d202-43bc-b37c-41f49b666d83","Type":"ContainerStarted","Data":"5b7f0e83cde0de1b008e258fd87c416aa13fefb9caa72ad816564bd2d8576ae2"}
Apr 16 14:32:17.522393 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:17.522348 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w" event={"ID":"bcf5dc26-6a7b-4d47-bd94-a76c16744638","Type":"ContainerStarted","Data":"49c843d1a6561043fac9a6a446aa074d9fc2b4c37aed6fc6bc242b26680d000d"}
Apr 16 14:32:17.524242 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:17.524214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w8wv" event={"ID":"eadb0665-d202-43bc-b37c-41f49b666d83","Type":"ContainerStarted","Data":"df3f6fb141545d2a65a12e5569838998f072959c971b4f54c1ad1db0f38f53b6"}
Apr 16 14:32:17.539676 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:17.539623 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-mxz6w" podStartSLOduration=1.5981594449999998 podStartE2EDuration="2.539606691s" podCreationTimestamp="2026-04-16 14:32:15 +0000 UTC" firstStartedPulling="2026-04-16 14:32:15.897066724 +0000 UTC m=+142.514856465" lastFinishedPulling="2026-04-16 14:32:16.838513955 +0000 UTC m=+143.456303711" observedRunningTime="2026-04-16 14:32:17.537330858 +0000 UTC m=+144.155120622" watchObservedRunningTime="2026-04-16 14:32:17.539606691 +0000 UTC m=+144.157396455"
Apr 16 14:32:19.532476 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:19.532439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6w8wv" event={"ID":"eadb0665-d202-43bc-b37c-41f49b666d83","Type":"ContainerStarted","Data":"f03e6e6d1607df8f5d24d205f756be1c5654dc8fbd1771c5577a5af9e4660a0b"}
Apr 16 14:32:19.551442 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:19.551392 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6w8wv" podStartSLOduration=2.035976838 podStartE2EDuration="4.551374201s" podCreationTimestamp="2026-04-16 14:32:15 +0000 UTC" firstStartedPulling="2026-04-16 14:32:16.088414917 +0000 UTC m=+142.706204658" lastFinishedPulling="2026-04-16 14:32:18.603812062 +0000 UTC m=+145.221602021" observedRunningTime="2026-04-16 14:32:19.549946252 +0000 UTC m=+146.167736017" watchObservedRunningTime="2026-04-16 14:32:19.551374201 +0000 UTC m=+146.169163968"
Apr 16 14:32:25.490814 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:25.490770 2579 patch_prober.go:28] interesting pod/image-registry-579fc97f6b-mdg5r container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:32:25.491233 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:25.490829 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" podUID="954a0b17-ede2-4a5f-a717-fb8f76331dee" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:32:29.239151 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:29.239106 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4t4r6" podUID="0b44026e-170c-4488-824c-c757c82f68cd"
Apr 16 14:32:29.248310 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:29.248267 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-src8k" podUID="93aad6d1-f6fd-49fd-aeac-3661dd3118bf"
Apr 16 14:32:29.559548 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:29.559516 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4t4r6"
Apr 16 14:32:29.993127 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:29.993019 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-m57qr" podUID="20614211-2bef-41dc-aad8-94242eb8364c"
Apr 16 14:32:31.036917 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.036886 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6db78bffcf-j4hnv"]
Apr 16 14:32:31.039743 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.039719 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.042680 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.042625 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:32:31.042680 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.042628 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-5hq8m\""
Apr 16 14:32:31.042864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.042746 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:32:31.042864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.042782 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:32:31.042864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.042826 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:32:31.042864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.042850 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:32:31.046827 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.046805 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 14:32:31.049265 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.049222 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db78bffcf-j4hnv"]
Apr 16 14:32:31.125970 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.125934 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8bv\" (UniqueName: \"kubernetes.io/projected/2840ca5d-533d-4e77-8636-f68c1bfa7c31-kube-api-access-zh8bv\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.126167 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.125995 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-serving-cert\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.126167 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.126027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-oauth-config\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.126167 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.126068 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-oauth-serving-cert\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.126167 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.126116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-config\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.126372 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.126202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-trusted-ca-bundle\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.126372 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.126222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-service-ca\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.227071 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8bv\" (UniqueName: \"kubernetes.io/projected/2840ca5d-533d-4e77-8636-f68c1bfa7c31-kube-api-access-zh8bv\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.227280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-serving-cert\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.227280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-oauth-config\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.227280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-oauth-serving-cert\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.227280 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-config\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.227540 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-trusted-ca-bundle\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.227540 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-service-ca\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.228057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.227996 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-oauth-serving-cert\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.228180 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.228086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-config\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.228254 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.228223 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-service-ca\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.228741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.228709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-trusted-ca-bundle\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.230338 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.230314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-serving-cert\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.230457 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.230440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-oauth-config\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.236835 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.236808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8bv\" (UniqueName: \"kubernetes.io/projected/2840ca5d-533d-4e77-8636-f68c1bfa7c31-kube-api-access-zh8bv\") pod \"console-6db78bffcf-j4hnv\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") " pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.351339 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.351253 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:31.581106 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.581078 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db78bffcf-j4hnv"]
Apr 16 14:32:31.584840 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:31.584807 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2840ca5d_533d_4e77_8636_f68c1bfa7c31.slice/crio-213c40b473e615b10aeea4aca7b9c55646d6a4a55fb4a3d7aafae7e556a261ad WatchSource:0}: Error finding container 213c40b473e615b10aeea4aca7b9c55646d6a4a55fb4a3d7aafae7e556a261ad: Status 404 returned error can't find the container with id 213c40b473e615b10aeea4aca7b9c55646d6a4a55fb4a3d7aafae7e556a261ad
Apr 16 14:32:31.700473 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.700383 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-g8nc9"]
Apr 16 14:32:31.708975 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.708943 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9"
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.711837 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.713583 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4cpq5\""
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.713639 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-g8nc9"]
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.713749 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.713840 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.714214 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.714469 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 14:32:31.714832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.714655 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:32:31.732346 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.732317 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-lqxdm"] Apr 16 14:32:31.735414 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.735391 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.737830 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.737813 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:32:31.738109 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.737828 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:32:31.738109 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.737909 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6npkq\"" Apr 16 14:32:31.738549 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.738534 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:32:31.833485 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833450 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.833667 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833498 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f50b4fad-71ea-4a5d-bac7-f28d01dab723-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: 
\"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.833667 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.833667 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06ded581-64bf-4eda-b995-f0d8319c99ac-metrics-client-ca\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.833833 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-wtmp\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.833833 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833723 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f50b4fad-71ea-4a5d-bac7-f28d01dab723-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.833833 ip-10-0-140-144 
kubenswrapper[2579]: I0416 14:32:31.833757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-root\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.833833 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-sys\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.833984 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833865 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmz7\" (UniqueName: \"kubernetes.io/projected/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-api-access-4nmz7\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.833984 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-accelerators-collector-config\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.833984 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.833979 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-tls\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.834146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.834014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.834146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.834056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-textfile\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.834146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.834087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.834146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.834109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nncr9\" (UniqueName: \"kubernetes.io/projected/06ded581-64bf-4eda-b995-f0d8319c99ac-kube-api-access-nncr9\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " 
pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.935269 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-tls\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.935456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.935521 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-textfile\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.935521 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.935619 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nncr9\" (UniqueName: 
\"kubernetes.io/projected/06ded581-64bf-4eda-b995-f0d8319c99ac-kube-api-access-nncr9\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.935619 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.935619 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935596 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f50b4fad-71ea-4a5d-bac7-f28d01dab723-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.935773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.935773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06ded581-64bf-4eda-b995-f0d8319c99ac-metrics-client-ca\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " 
pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.935773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935693 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-wtmp\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.935773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f50b4fad-71ea-4a5d-bac7-f28d01dab723-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.935773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-root\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.936003 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-sys\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.936003 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-textfile\") pod \"node-exporter-lqxdm\" (UID: 
\"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.936003 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmz7\" (UniqueName: \"kubernetes.io/projected/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-api-access-4nmz7\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.936003 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.935922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-accelerators-collector-config\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.936356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-root\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.936417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-sys\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.936458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/06ded581-64bf-4eda-b995-f0d8319c99ac-metrics-client-ca\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.936621 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-wtmp\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.936922 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.937172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-accelerators-collector-config\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.937193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f50b4fad-71ea-4a5d-bac7-f28d01dab723-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.937284 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.937246 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f50b4fad-71ea-4a5d-bac7-f28d01dab723-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.938240 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.938215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-tls\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.938547 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.938514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.938729 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.938710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/06ded581-64bf-4eda-b995-f0d8319c99ac-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.939514 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.939486 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:31.945318 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.945281 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nncr9\" (UniqueName: \"kubernetes.io/projected/06ded581-64bf-4eda-b995-f0d8319c99ac-kube-api-access-nncr9\") pod \"node-exporter-lqxdm\" (UID: \"06ded581-64bf-4eda-b995-f0d8319c99ac\") " pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:31.945415 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:31.945390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nmz7\" (UniqueName: \"kubernetes.io/projected/f50b4fad-71ea-4a5d-bac7-f28d01dab723-kube-api-access-4nmz7\") pod \"kube-state-metrics-7479c89684-g8nc9\" (UID: \"f50b4fad-71ea-4a5d-bac7-f28d01dab723\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:32.030215 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.027829 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" Apr 16 14:32:32.048367 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.047965 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lqxdm" Apr 16 14:32:32.191138 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.188172 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-g8nc9"] Apr 16 14:32:32.193299 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:32.193264 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50b4fad_71ea_4a5d_bac7_f28d01dab723.slice/crio-8088c081e64058e557c06840c1b9c715a1217cac68b0a470578075eee5267a92 WatchSource:0}: Error finding container 8088c081e64058e557c06840c1b9c715a1217cac68b0a470578075eee5267a92: Status 404 returned error can't find the container with id 8088c081e64058e557c06840c1b9c715a1217cac68b0a470578075eee5267a92 Apr 16 14:32:32.570941 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.570862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-zvqf4" event={"ID":"c405329f-15f2-4fd4-8b05-4f3e19783a99","Type":"ContainerStarted","Data":"4d57a3bc4b79cbba888097efba7425ad6f108948a8b369133f225d7e06b96aaf"} Apr 16 14:32:32.572129 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.572063 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-zvqf4" Apr 16 14:32:32.574533 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.574467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db78bffcf-j4hnv" event={"ID":"2840ca5d-533d-4e77-8636-f68c1bfa7c31","Type":"ContainerStarted","Data":"213c40b473e615b10aeea4aca7b9c55646d6a4a55fb4a3d7aafae7e556a261ad"} Apr 16 14:32:32.576466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.576429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqxdm" 
event={"ID":"06ded581-64bf-4eda-b995-f0d8319c99ac","Type":"ContainerStarted","Data":"69d7d583dc042809d739f7841caafcef55be0debe5a04d809c770a031241f138"} Apr 16 14:32:32.578000 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.577973 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" event={"ID":"f50b4fad-71ea-4a5d-bac7-f28d01dab723","Type":"ContainerStarted","Data":"8088c081e64058e557c06840c1b9c715a1217cac68b0a470578075eee5267a92"} Apr 16 14:32:32.585615 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.582630 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-zvqf4" Apr 16 14:32:32.595798 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.595692 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-zvqf4" podStartSLOduration=2.115545509 podStartE2EDuration="17.595674778s" podCreationTimestamp="2026-04-16 14:32:15 +0000 UTC" firstStartedPulling="2026-04-16 14:32:16.056586509 +0000 UTC m=+142.674376250" lastFinishedPulling="2026-04-16 14:32:31.536715775 +0000 UTC m=+158.154505519" observedRunningTime="2026-04-16 14:32:32.595002652 +0000 UTC m=+159.212792413" watchObservedRunningTime="2026-04-16 14:32:32.595674778 +0000 UTC m=+159.213464543" Apr 16 14:32:32.797898 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.797852 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:32:32.806795 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.806758 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:32:32.809594 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.809245 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:32:32.809594 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.809485 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:32:32.810408 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.809809 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dr2rr\"" Apr 16 14:32:32.810408 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.810082 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:32:32.810408 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.810100 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:32:32.810408 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.810124 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:32:32.810408 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.810270 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:32:32.810408 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.810278 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:32:32.810408 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.810316 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 14:32:32.811234 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.811215 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 14:32:32.815004 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.814953 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:32:32.946432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.945999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.946432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946078 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.946432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-web-config\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.946432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.946432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946174 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.946432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.946905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946828 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.946905 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946888 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.947009 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946921 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.947009 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.946948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2rz2\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-kube-api-access-b2rz2\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.947131 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.947041 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-config-volume\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.947131 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.947095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:32.947236 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:32.947131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-config-out\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047654 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047778 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047778 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2rz2\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-kube-api-access-b2rz2\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047778 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-config-volume\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047801 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-config-out\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.047926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.048137 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-web-config\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.048137 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047960 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.048137 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.047983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.048137 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.048023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.048137 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.048133 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.048822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.048531 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.048822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.048532 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.049413 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:33.049382 2579 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 14:32:33.049486 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:33.049455 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls podName:0d94d7fa-516f-4583-967f-477abf72f68e nodeName:}" failed. No retries permitted until 2026-04-16 14:32:33.549434424 +0000 UTC m=+160.167224167 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e") : secret "alertmanager-main-tls" not found
Apr 16 14:32:33.050131 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.049547 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.057811 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.057137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.058403 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.058380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.059409 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.059371 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-web-config\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.059944 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.059908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.060537 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.060483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-config-volume\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.061172 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.061133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2rz2\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-kube-api-access-b2rz2\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.061642 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.061607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.063680 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.063640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-config-out\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.064410 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.064387 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.554225 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.554144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:33.554418 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:33.554232 2579 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 14:32:33.554418 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:33.554315 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls podName:0d94d7fa-516f-4583-967f-477abf72f68e nodeName:}" failed. No retries permitted until 2026-04-16 14:32:34.554293472 +0000 UTC m=+161.172083234 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e") : secret "alertmanager-main-tls" not found
Apr 16 14:32:33.587430 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:33.587368 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqxdm" event={"ID":"06ded581-64bf-4eda-b995-f0d8319c99ac","Type":"ContainerStarted","Data":"7b93408534e33d2d6d87976159e13c10a9c71439db06c85def7b40b0ea3f1fd2"}
Apr 16 14:32:34.161219 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.161169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6"
Apr 16 14:32:34.161790 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.161271 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k"
Apr 16 14:32:34.164629 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.164568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b44026e-170c-4488-824c-c757c82f68cd-metrics-tls\") pod \"dns-default-4t4r6\" (UID: \"0b44026e-170c-4488-824c-c757c82f68cd\") " pod="openshift-dns/dns-default-4t4r6"
Apr 16 14:32:34.164886 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.164802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93aad6d1-f6fd-49fd-aeac-3661dd3118bf-cert\") pod \"ingress-canary-src8k\" (UID: \"93aad6d1-f6fd-49fd-aeac-3661dd3118bf\") " pod="openshift-ingress-canary/ingress-canary-src8k"
Apr 16 14:32:34.362369 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.362336 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dqsrn\""
Apr 16 14:32:34.370275 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.370246 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4t4r6"
Apr 16 14:32:34.564948 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.564906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:34.568944 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.568888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:34.592694 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.592655 2579 generic.go:358] "Generic (PLEG): container finished" podID="06ded581-64bf-4eda-b995-f0d8319c99ac" containerID="7b93408534e33d2d6d87976159e13c10a9c71439db06c85def7b40b0ea3f1fd2" exitCode=0
Apr 16 14:32:34.594396 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.594327 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqxdm" event={"ID":"06ded581-64bf-4eda-b995-f0d8319c99ac","Type":"ContainerDied","Data":"7b93408534e33d2d6d87976159e13c10a9c71439db06c85def7b40b0ea3f1fd2"}
Apr 16 14:32:34.619347 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:34.619317 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:32:35.489999 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:35.489968 2579 patch_prober.go:28] interesting pod/image-registry-579fc97f6b-mdg5r container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:32:35.490446 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:35.490023 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" podUID="954a0b17-ede2-4a5f-a717-fb8f76331dee" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:32:35.841168 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:35.840993 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4t4r6"]
Apr 16 14:32:35.846982 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:35.846947 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b44026e_170c_4488_824c_c757c82f68cd.slice/crio-3e0466c097f7f6269f7f53a91a143e753476fac27c60d2626a5960880cb006b4 WatchSource:0}: Error finding container 3e0466c097f7f6269f7f53a91a143e753476fac27c60d2626a5960880cb006b4: Status 404 returned error can't find the container with id 3e0466c097f7f6269f7f53a91a143e753476fac27c60d2626a5960880cb006b4
Apr 16 14:32:35.860278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:35.860250 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:32:36.112591 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.112526 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c4b5cb6c-c868m"]
Apr 16 14:32:36.124093 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.124060 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.126851 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.126819 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 14:32:36.127004 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.126955 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:32:36.127114 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.127085 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 14:32:36.127252 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.127236 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 14:32:36.127323 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.127279 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-sab9k178pk24\""
Apr 16 14:32:36.127458 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.127442 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-prtf9\""
Apr 16 14:32:36.128065 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.128021 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c4b5cb6c-c868m"]
Apr 16 14:32:36.282822
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.282792 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwt5\" (UniqueName: \"kubernetes.io/projected/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-kube-api-access-6kwt5\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.283012 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.282911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-secret-metrics-server-tls\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.283012 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.282953 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-audit-log\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.283012 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.282989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-metrics-server-audit-profiles\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.283205 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.283129 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-secret-metrics-server-client-certs\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.283205 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.283158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.283287 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.283234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-client-ca-bundle\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.383882 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.383840 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kwt5\" (UniqueName: \"kubernetes.io/projected/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-kube-api-access-6kwt5\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384075 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.383912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-secret-metrics-server-tls\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384075 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.383942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-audit-log\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384075 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.383969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-metrics-server-audit-profiles\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384075 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.384016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-secret-metrics-server-client-certs\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384296 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.384078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384444 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.384418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-client-ca-bundle\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384589 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.384422 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-audit-log\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.384976 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.384903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.385119 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.385056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-metrics-server-audit-profiles\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.387174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.387150 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-client-ca-bundle\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\")
" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.387278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.387208 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-secret-metrics-server-client-certs\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.387278 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.387245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-secret-metrics-server-tls\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.398576 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.398548 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwt5\" (UniqueName: \"kubernetes.io/projected/2cf40cf8-92d8-4a5a-afda-a5c5e3009baf-kube-api-access-6kwt5\") pod \"metrics-server-7c4b5cb6c-c868m\" (UID: \"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf\") " pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.436274 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.436239 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:36.599828 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.599791 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c4b5cb6c-c868m"]
Apr 16 14:32:36.607708 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.607603 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqxdm" event={"ID":"06ded581-64bf-4eda-b995-f0d8319c99ac","Type":"ContainerStarted","Data":"d05520020b5c4c7e9f424b89cb7cf28c16fefb0709d34d4cd6113cc31321a3b1"}
Apr 16 14:32:36.607830 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.607724 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqxdm" event={"ID":"06ded581-64bf-4eda-b995-f0d8319c99ac","Type":"ContainerStarted","Data":"bc3d3461a561ee817a009dadfa2b18039a1c358a00f6b9d2601d0cb72ac5df94"}
Apr 16 14:32:36.610819 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.610791 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" event={"ID":"f50b4fad-71ea-4a5d-bac7-f28d01dab723","Type":"ContainerStarted","Data":"ce82186f1f7a6d61401b60db6e5aed3f569a33a14eff655bb4cadb91d41f55e3"}
Apr 16 14:32:36.610928 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.610827 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" event={"ID":"f50b4fad-71ea-4a5d-bac7-f28d01dab723","Type":"ContainerStarted","Data":"89eb2338a5a818328d04f5f31a792860b1646464699f3663b9e3a8f0a8135686"}
Apr 16 14:32:36.610928 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.610843 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" event={"ID":"f50b4fad-71ea-4a5d-bac7-f28d01dab723","Type":"ContainerStarted","Data":"5acf92f4e926a88f23a6b4f83fd2922e62b3ab1dcfcbb208917426a989d5e266"}
Apr 16 14:32:36.614067 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.614026 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerStarted","Data":"b64c703cbffbe56c5b3495fa5117c770f27b50a10bf82fab0657820e8eac6ad5"}
Apr 16 14:32:36.615363 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.615315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4t4r6" event={"ID":"0b44026e-170c-4488-824c-c757c82f68cd","Type":"ContainerStarted","Data":"3e0466c097f7f6269f7f53a91a143e753476fac27c60d2626a5960880cb006b4"}
Apr 16 14:32:36.617540 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.617505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db78bffcf-j4hnv" event={"ID":"2840ca5d-533d-4e77-8636-f68c1bfa7c31","Type":"ContainerStarted","Data":"04ba492bc696dd5a9fe584d120a15080fb94a7ea83fdd2c433ca255eafe07f0d"}
Apr 16 14:32:36.633502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.633446 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lqxdm" podStartSLOduration=4.670488132 podStartE2EDuration="5.633430828s" podCreationTimestamp="2026-04-16 14:32:31 +0000 UTC" firstStartedPulling="2026-04-16 14:32:32.066533063 +0000 UTC m=+158.684322820" lastFinishedPulling="2026-04-16 14:32:33.02947575 +0000 UTC m=+159.647265516" observedRunningTime="2026-04-16 14:32:36.633002524 +0000 UTC m=+163.250792288" watchObservedRunningTime="2026-04-16 14:32:36.633430828 +0000 UTC m=+163.251220591"
Apr 16 14:32:36.682566 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.682507 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-g8nc9" podStartSLOduration=2.209710639 podStartE2EDuration="5.682486872s" podCreationTimestamp="2026-04-16 14:32:31 +0000 UTC" firstStartedPulling="2026-04-16 14:32:32.196735241 +0000 UTC m=+158.814524988" lastFinishedPulling="2026-04-16 14:32:35.669511463 +0000 UTC m=+162.287301221" observedRunningTime="2026-04-16 14:32:36.655800082 +0000 UTC m=+163.273589848" watchObservedRunningTime="2026-04-16 14:32:36.682486872 +0000 UTC m=+163.300276635"
Apr 16 14:32:36.683406 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:36.683359 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6db78bffcf-j4hnv" podStartSLOduration=1.5928767430000002 podStartE2EDuration="5.683346429s" podCreationTimestamp="2026-04-16 14:32:31 +0000 UTC" firstStartedPulling="2026-04-16 14:32:31.587092898 +0000 UTC m=+158.204882639" lastFinishedPulling="2026-04-16 14:32:35.677562577 +0000 UTC m=+162.295352325" observedRunningTime="2026-04-16 14:32:36.682155472 +0000 UTC m=+163.299945236" watchObservedRunningTime="2026-04-16 14:32:36.683346429 +0000 UTC m=+163.301136193"
Apr 16 14:32:37.621864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.621820 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m" event={"ID":"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf","Type":"ContainerStarted","Data":"94a3c2bb0a3ff0f845e4acb09fc1218fb530039b29469de4c5eaad022a4787a3"}
Apr 16 14:32:37.936240 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.936156 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:32:37.960116 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.960080 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:32:37.960297 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.960239 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:37.962676 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.962654 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:32:37.964015 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.963987 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:32:37.964390 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.964304 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:32:37.964512 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.964394 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:32:37.964512 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.964306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:32:37.965204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.964720 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:32:37.965204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.965149 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2xbnh\"" Apr 16 14:32:37.965686 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.965557 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:32:37.965686 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.965563 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:32:37.965930 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.965842 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:32:37.966021 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.965914 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:32:37.966021 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.966010 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:32:37.966210 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.965917 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cmr8bvo245iv2\"" Apr 16 14:32:37.967723 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:37.967409 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:32:38.105053 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.104988 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105055 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105084 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105193 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105231 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-config-out\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105292 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105323 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105351 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105387 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-config\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrs4\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-kube-api-access-rvrs4\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105450 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-web-config\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.105701 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.105613 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.206952 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.206918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.206972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207146 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207077 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-config-out\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207207 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-config\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrs4\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-kube-api-access-rvrs4\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-web-config\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207419 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207438 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.207822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.207456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.211080 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.209174 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.212844 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.212754 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.224688 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.224380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.225752 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.225697 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-web-config\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.226164 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.226098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-config-out\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.226164 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.226152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.226456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.226433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.227378 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.227261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.227481 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.227393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrs4\" 
(UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-kube-api-access-rvrs4\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.227645 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.227591 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.228720 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.227890 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.228720 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.228086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.228720 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.228417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.228720 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.228552 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.229081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.228951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.229081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.229009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.230316 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.230267 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-config\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.231110 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.231084 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:32:38.275399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.275370 2579 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:32:38.453267 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.453233 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:32:38.628569 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.628530 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4t4r6" event={"ID":"0b44026e-170c-4488-824c-c757c82f68cd","Type":"ContainerStarted","Data":"bb406c8f1598654fcf6ce997a45b159816bbb457124bfd33111ea112c5f6f0da"}
Apr 16 14:32:38.628569 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.628575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4t4r6" event={"ID":"0b44026e-170c-4488-824c-c757c82f68cd","Type":"ContainerStarted","Data":"7b6d955110169a448a66f57148df409d4afeb5172a78ca1aff831ba8525256fa"}
Apr 16 14:32:38.629150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.628691 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4t4r6"
Apr 16 14:32:38.648395 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:38.648010 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4t4r6" podStartSLOduration=130.442431841 podStartE2EDuration="2m12.647989808s" podCreationTimestamp="2026-04-16 14:30:26 +0000 UTC" firstStartedPulling="2026-04-16 14:32:35.849714261 +0000 UTC m=+162.467504015" lastFinishedPulling="2026-04-16 14:32:38.055272232 +0000 UTC m=+164.673061982" observedRunningTime="2026-04-16 14:32:38.647759637 +0000 UTC m=+165.265549411" watchObservedRunningTime="2026-04-16 14:32:38.647989808 +0000 UTC m=+165.265779571"
Apr 16 14:32:38.728165 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:38.728083 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671ec7c2_af4a_4a14_9063_810caf17324b.slice/crio-1deb5e7398df1adfb152cf19c244afbc2c5325a5a1c89c3b2704bfeff93840f1 WatchSource:0}: Error finding container 1deb5e7398df1adfb152cf19c244afbc2c5325a5a1c89c3b2704bfeff93840f1: Status 404 returned error can't find the container with id 1deb5e7398df1adfb152cf19c244afbc2c5325a5a1c89c3b2704bfeff93840f1
Apr 16 14:32:39.633168 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:39.633126 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d94d7fa-516f-4583-967f-477abf72f68e" containerID="8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae" exitCode=0
Apr 16 14:32:39.633642 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:39.633209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae"}
Apr 16 14:32:39.634941 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:39.634870 2579 generic.go:358] "Generic (PLEG): container finished" podID="671ec7c2-af4a-4a14-9063-810caf17324b" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395" exitCode=0
Apr 16 14:32:39.635022 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:39.634957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"}
Apr 16 14:32:39.635022 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:39.634994 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerStarted","Data":"1deb5e7398df1adfb152cf19c244afbc2c5325a5a1c89c3b2704bfeff93840f1"}
Apr 16 14:32:40.504188 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:40.503901 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" podUID="954a0b17-ede2-4a5f-a717-fb8f76331dee" containerName="registry" containerID="cri-o://ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50" gracePeriod=30
Apr 16 14:32:41.052402 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.052369 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r"
Apr 16 14:32:41.140916 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.140814 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-bound-sa-token\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.140916 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.140869 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-trusted-ca\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.141466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.140929 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/954a0b17-ede2-4a5f-a717-fb8f76331dee-ca-trust-extracted\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.141466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.140994 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-installation-pull-secrets\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.141466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.141024 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-image-registry-private-configuration\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.141466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.141085 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-certificates\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.141466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.141122 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.141466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.141174 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c94qs\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-kube-api-access-c94qs\") pod \"954a0b17-ede2-4a5f-a717-fb8f76331dee\" (UID: \"954a0b17-ede2-4a5f-a717-fb8f76331dee\") "
Apr 16 14:32:41.141466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.141462 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:32:41.142437 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.141907 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:32:41.146638 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.146571 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:32:41.146638 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.146573 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-kube-api-access-c94qs" (OuterVolumeSpecName: "kube-api-access-c94qs") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "kube-api-access-c94qs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:32:41.146805 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.146707 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:32:41.146867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.146812 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:32:41.147723 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.147687 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:32:41.155818 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.155790 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/954a0b17-ede2-4a5f-a717-fb8f76331dee-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "954a0b17-ede2-4a5f-a717-fb8f76331dee" (UID: "954a0b17-ede2-4a5f-a717-fb8f76331dee"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:32:41.242607 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242566 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/954a0b17-ede2-4a5f-a717-fb8f76331dee-ca-trust-extracted\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.242607 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242610 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-installation-pull-secrets\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.242906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242629 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/954a0b17-ede2-4a5f-a717-fb8f76331dee-image-registry-private-configuration\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.242906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242645 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-certificates\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.242906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242660 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-registry-tls\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.242906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242676 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c94qs\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-kube-api-access-c94qs\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.242906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242691 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/954a0b17-ede2-4a5f-a717-fb8f76331dee-bound-sa-token\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.242906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.242705 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/954a0b17-ede2-4a5f-a717-fb8f76331dee-trusted-ca\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:32:41.352246 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.352192 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:41.352637 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.352617 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:41.358995 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.358968 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:41.374528 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.374508 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:32:41.645262 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.645134 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m" event={"ID":"2cf40cf8-92d8-4a5a-afda-a5c5e3009baf","Type":"ContainerStarted","Data":"613e9671c108404c18d9522896859173cba1d70a72d6a0c5623d90206c4ccd87"}
Apr 16 14:32:41.647144 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.647097 2579 generic.go:358] "Generic (PLEG): container finished" podID="954a0b17-ede2-4a5f-a717-fb8f76331dee" containerID="ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50" exitCode=0
Apr 16 14:32:41.647144 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.647117 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r"
Apr 16 14:32:41.647144 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.647130 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" event={"ID":"954a0b17-ede2-4a5f-a717-fb8f76331dee","Type":"ContainerDied","Data":"ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50"}
Apr 16 14:32:41.647386 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.647158 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579fc97f6b-mdg5r" event={"ID":"954a0b17-ede2-4a5f-a717-fb8f76331dee","Type":"ContainerDied","Data":"306e07d9a258a7bfafadfd7ee25b167039d310bdb1e8c2a9190c46cb5c922f75"}
Apr 16 14:32:41.647386 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.647178 2579 scope.go:117] "RemoveContainer" containerID="ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50"
Apr 16 14:32:41.667544 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.667394 2579 scope.go:117] "RemoveContainer" containerID="ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50"
Apr 16 14:32:41.667908 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:32:41.667874 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50\": container with ID starting with ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50 not found: ID does not exist" containerID="ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50"
Apr 16 14:32:41.668019 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.667917 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50"} err="failed to get container status \"ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50\": rpc error: code = NotFound desc = could not find container \"ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50\": container with ID starting with ac9ac373d68b76df0ce29d292f237361257c79f21ac910aca3d651addb111e50 not found: ID does not exist"
Apr 16 14:32:41.674147 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.674088 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m" podStartSLOduration=0.945555109 podStartE2EDuration="5.674070478s" podCreationTimestamp="2026-04-16 14:32:36 +0000 UTC" firstStartedPulling="2026-04-16 14:32:36.609301504 +0000 UTC m=+163.227091259" lastFinishedPulling="2026-04-16 14:32:41.337816876 +0000 UTC m=+167.955606628" observedRunningTime="2026-04-16 14:32:41.671739971 +0000 UTC m=+168.289529747" watchObservedRunningTime="2026-04-16 14:32:41.674070478 +0000 UTC m=+168.291860242"
Apr 16 14:32:41.688059 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.688016 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-579fc97f6b-mdg5r"]
Apr 16 14:32:41.693094 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.693061 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-579fc97f6b-mdg5r"]
Apr 16 14:32:41.979605 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:41.979527 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="954a0b17-ede2-4a5f-a717-fb8f76331dee" path="/var/lib/kubelet/pods/954a0b17-ede2-4a5f-a717-fb8f76331dee/volumes"
Apr 16 14:32:43.975621 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:43.975602 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-src8k"
Apr 16 14:32:43.978140 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:43.978123 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zpkvj\""
Apr 16 14:32:43.986073 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:43.986054 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-src8k"
Apr 16 14:32:44.130304 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.130273 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-src8k"]
Apr 16 14:32:44.133446 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:32:44.133419 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93aad6d1_f6fd_49fd_aeac_3661dd3118bf.slice/crio-1b0889afd755c7afb254c5f317bae5d13124ce2b1b868564cf90927a03ae8e19 WatchSource:0}: Error finding container 1b0889afd755c7afb254c5f317bae5d13124ce2b1b868564cf90927a03ae8e19: Status 404 returned error can't find the container with id 1b0889afd755c7afb254c5f317bae5d13124ce2b1b868564cf90927a03ae8e19
Apr 16 14:32:44.663629 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.663574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerStarted","Data":"1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc"}
Apr 16 14:32:44.663629 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.663631 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerStarted","Data":"8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc"}
Apr 16 14:32:44.663974 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.663648 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerStarted","Data":"1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23"}
Apr 16 14:32:44.663974 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.663661 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerStarted","Data":"ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2"}
Apr 16 14:32:44.663974 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.663673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerStarted","Data":"d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf"}
Apr 16 14:32:44.664955 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.664924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-src8k" event={"ID":"93aad6d1-f6fd-49fd-aeac-3661dd3118bf","Type":"ContainerStarted","Data":"1b0889afd755c7afb254c5f317bae5d13124ce2b1b868564cf90927a03ae8e19"}
Apr 16 14:32:44.667646 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.667611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerStarted","Data":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"}
Apr 16 14:32:44.667782 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.667658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerStarted","Data":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"}
Apr 16 14:32:44.973750 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:44.973666 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr"
Apr 16 14:32:47.677546 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.677442 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-src8k" event={"ID":"93aad6d1-f6fd-49fd-aeac-3661dd3118bf","Type":"ContainerStarted","Data":"9ca6f20b3a1eca057404b59ba3b5178f899be3eddf53455021a7e52e6eeb3586"}
Apr 16 14:32:47.680362 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.680334 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerStarted","Data":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"}
Apr 16 14:32:47.680480 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.680367 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerStarted","Data":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"}
Apr 16 14:32:47.680480 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.680379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerStarted","Data":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"}
Apr 16 14:32:47.680480 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.680387 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerStarted","Data":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"}
Apr 16 14:32:47.683227 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.683198 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerStarted","Data":"3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444"}
Apr 16 14:32:47.694328 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.694281 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-src8k" podStartSLOduration=138.812464001 podStartE2EDuration="2m21.694269124s" podCreationTimestamp="2026-04-16 14:30:26 +0000 UTC" firstStartedPulling="2026-04-16 14:32:44.13532028 +0000 UTC m=+170.753110020" lastFinishedPulling="2026-04-16 14:32:47.017125391 +0000 UTC m=+173.634915143" observedRunningTime="2026-04-16 14:32:47.69373432 +0000 UTC m=+174.311524084" watchObservedRunningTime="2026-04-16 14:32:47.694269124 +0000 UTC m=+174.312058890"
Apr 16 14:32:47.723781 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.723709 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.440492061 podStartE2EDuration="10.723691737s" podCreationTimestamp="2026-04-16 14:32:37 +0000 UTC" firstStartedPulling="2026-04-16 14:32:38.730337865 +0000 UTC m=+165.348127606" lastFinishedPulling="2026-04-16 14:32:47.013537538 +0000 UTC m=+173.631327282" observedRunningTime="2026-04-16 14:32:47.722317989 +0000 UTC m=+174.340107762" watchObservedRunningTime="2026-04-16 14:32:47.723691737 +0000 UTC m=+174.341481481"
Apr 16 14:32:47.747016 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:47.746951 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.603278549 podStartE2EDuration="15.746932435s" podCreationTimestamp="2026-04-16 14:32:32 +0000 UTC" firstStartedPulling="2026-04-16 14:32:35.869816965 +0000 UTC m=+162.487606709" lastFinishedPulling="2026-04-16 14:32:47.01347085 +0000 UTC m=+173.631260595" observedRunningTime="2026-04-16 14:32:47.746367848 +0000 UTC m=+174.364157623" watchObservedRunningTime="2026-04-16 14:32:47.746932435 +0000 UTC m=+174.364722199"
Apr 16 14:32:48.276169 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:48.276137 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:32:48.514190 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:48.514157 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db78bffcf-j4hnv"]
Apr 16 14:32:48.637252 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:48.637160 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4t4r6"
Apr 16 14:32:56.436998 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:56.436952 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:32:56.436998 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:32:56.437007 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:33:13.534137 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.534077 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6db78bffcf-j4hnv" podUID="2840ca5d-533d-4e77-8636-f68c1bfa7c31" containerName="console" containerID="cri-o://04ba492bc696dd5a9fe584d120a15080fb94a7ea83fdd2c433ca255eafe07f0d" gracePeriod=15
Apr 16 14:33:13.759611 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.759576 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db78bffcf-j4hnv_2840ca5d-533d-4e77-8636-f68c1bfa7c31/console/0.log"
Apr 16 14:33:13.759761 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.759626 2579 generic.go:358] "Generic (PLEG): container finished" podID="2840ca5d-533d-4e77-8636-f68c1bfa7c31" containerID="04ba492bc696dd5a9fe584d120a15080fb94a7ea83fdd2c433ca255eafe07f0d" exitCode=2
Apr 16 14:33:13.759761 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.759713 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db78bffcf-j4hnv" event={"ID":"2840ca5d-533d-4e77-8636-f68c1bfa7c31","Type":"ContainerDied","Data":"04ba492bc696dd5a9fe584d120a15080fb94a7ea83fdd2c433ca255eafe07f0d"}
Apr 16 14:33:13.808619 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.808590 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db78bffcf-j4hnv_2840ca5d-533d-4e77-8636-f68c1bfa7c31/console/0.log"
Apr 16 14:33:13.808747 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.808668 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:33:13.940072 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940046 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-service-ca\") pod \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") "
Apr 16 14:33:13.940260 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940083 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-oauth-config\") pod \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") "
Apr 16 14:33:13.940260 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940125 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-serving-cert\") pod \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") "
Apr 16 14:33:13.940396 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940259 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-trusted-ca-bundle\") pod \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") "
Apr 16 14:33:13.940396 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940324 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-config\") pod \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") "
Apr 16 14:33:13.940396 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940366 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8bv\" (UniqueName: \"kubernetes.io/projected/2840ca5d-533d-4e77-8636-f68c1bfa7c31-kube-api-access-zh8bv\") pod \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") "
Apr 16 14:33:13.940554 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940398 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-oauth-serving-cert\") pod \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\" (UID: \"2840ca5d-533d-4e77-8636-f68c1bfa7c31\") "
Apr 16 14:33:13.940554 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940400 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-service-ca" (OuterVolumeSpecName: "service-ca") pod "2840ca5d-533d-4e77-8636-f68c1bfa7c31" (UID: "2840ca5d-533d-4e77-8636-f68c1bfa7c31"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:13.940765 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940697 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-service-ca\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:13.940765 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940718 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2840ca5d-533d-4e77-8636-f68c1bfa7c31" (UID: "2840ca5d-533d-4e77-8636-f68c1bfa7c31"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:13.941015 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.940876 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-config" (OuterVolumeSpecName: "console-config") pod "2840ca5d-533d-4e77-8636-f68c1bfa7c31" (UID: "2840ca5d-533d-4e77-8636-f68c1bfa7c31"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:13.941098 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.941008 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2840ca5d-533d-4e77-8636-f68c1bfa7c31" (UID: "2840ca5d-533d-4e77-8636-f68c1bfa7c31"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:13.942822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.942795 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2840ca5d-533d-4e77-8636-f68c1bfa7c31" (UID: "2840ca5d-533d-4e77-8636-f68c1bfa7c31"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:13.942944 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.942922 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2840ca5d-533d-4e77-8636-f68c1bfa7c31" (UID: "2840ca5d-533d-4e77-8636-f68c1bfa7c31"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:13.943007 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:13.942945 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2840ca5d-533d-4e77-8636-f68c1bfa7c31-kube-api-access-zh8bv" (OuterVolumeSpecName: "kube-api-access-zh8bv") pod "2840ca5d-533d-4e77-8636-f68c1bfa7c31" (UID: "2840ca5d-533d-4e77-8636-f68c1bfa7c31"). InnerVolumeSpecName "kube-api-access-zh8bv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:33:14.041753 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.041713 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-trusted-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:14.041753 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.041745 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:14.041753 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.041755 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zh8bv\" (UniqueName: \"kubernetes.io/projected/2840ca5d-533d-4e77-8636-f68c1bfa7c31-kube-api-access-zh8bv\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:14.041753 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.041764 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2840ca5d-533d-4e77-8636-f68c1bfa7c31-oauth-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:14.042011 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.041773 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-oauth-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:14.042011 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.041782 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2840ca5d-533d-4e77-8636-f68c1bfa7c31-console-serving-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:14.763819 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.763777 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db78bffcf-j4hnv_2840ca5d-533d-4e77-8636-f68c1bfa7c31/console/0.log"
Apr 16 14:33:14.764317 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.763864 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db78bffcf-j4hnv" event={"ID":"2840ca5d-533d-4e77-8636-f68c1bfa7c31","Type":"ContainerDied","Data":"213c40b473e615b10aeea4aca7b9c55646d6a4a55fb4a3d7aafae7e556a261ad"}
Apr 16 14:33:14.764317 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.763891 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db78bffcf-j4hnv"
Apr 16 14:33:14.764317 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.763905 2579 scope.go:117] "RemoveContainer" containerID="04ba492bc696dd5a9fe584d120a15080fb94a7ea83fdd2c433ca255eafe07f0d"
Apr 16 14:33:14.782229 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.782200 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db78bffcf-j4hnv"]
Apr 16 14:33:14.785936 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:14.785915 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6db78bffcf-j4hnv"]
Apr 16 14:33:15.977435 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:15.977395 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2840ca5d-533d-4e77-8636-f68c1bfa7c31" path="/var/lib/kubelet/pods/2840ca5d-533d-4e77-8636-f68c1bfa7c31/volumes"
Apr 16 14:33:16.441731 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:16.441701 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m"
Apr 16 14:33:16.445541 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:16.445519 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-monitoring/metrics-server-7c4b5cb6c-c868m" Apr 16 14:33:38.276519 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:38.276484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:38.296204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:38.296175 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:38.846651 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:38.846626 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:52.060909 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.060870 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:33:52.061542 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.061485 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="alertmanager" containerID="cri-o://1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc" gracePeriod=120 Apr 16 14:33:52.061695 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.061551 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-web" containerID="cri-o://ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2" gracePeriod=120 Apr 16 14:33:52.061695 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.061568 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="config-reloader" containerID="cri-o://d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf" 
gracePeriod=120 Apr 16 14:33:52.061695 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.061622 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy" containerID="cri-o://1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23" gracePeriod=120 Apr 16 14:33:52.061695 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.061551 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-metric" containerID="cri-o://8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc" gracePeriod=120 Apr 16 14:33:52.061924 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.061629 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="prom-label-proxy" containerID="cri-o://3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444" gracePeriod=120 Apr 16 14:33:52.873864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873830 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d94d7fa-516f-4583-967f-477abf72f68e" containerID="3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444" exitCode=0 Apr 16 14:33:52.873864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873858 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d94d7fa-516f-4583-967f-477abf72f68e" containerID="1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23" exitCode=0 Apr 16 14:33:52.873864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873866 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d94d7fa-516f-4583-967f-477abf72f68e" containerID="ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2" 
exitCode=0 Apr 16 14:33:52.873864 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873872 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d94d7fa-516f-4583-967f-477abf72f68e" containerID="d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf" exitCode=0 Apr 16 14:33:52.874174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873877 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d94d7fa-516f-4583-967f-477abf72f68e" containerID="1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc" exitCode=0 Apr 16 14:33:52.874174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873909 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444"} Apr 16 14:33:52.874174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873951 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23"} Apr 16 14:33:52.874174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873965 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2"} Apr 16 14:33:52.874174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf"} Apr 16 14:33:52.874174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:52.873989 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc"} Apr 16 14:33:53.304537 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.304512 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:33:53.395614 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395533 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2rz2\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-kube-api-access-b2rz2\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.395614 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395579 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-metrics-client-ca\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.395804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395631 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.395804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395664 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-config-out\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " 
Apr 16 14:33:53.395804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395689 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-main-db\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.395804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395726 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-web-config\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.395804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395755 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-web\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.395804 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395791 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.396150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395845 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.396150 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395870 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-trusted-ca-bundle\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.396150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395897 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-cluster-tls-config\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.396150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395924 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-tls-assets\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.396150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.395972 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-config-volume\") pod \"0d94d7fa-516f-4583-967f-477abf72f68e\" (UID: \"0d94d7fa-516f-4583-967f-477abf72f68e\") " Apr 16 14:33:53.396150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.396098 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:53.396464 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.396285 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-main-db\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.396464 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.396332 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:33:53.396563 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.396532 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:33:53.398803 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.398753 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:53.399282 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.399241 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-kube-api-access-b2rz2" (OuterVolumeSpecName: "kube-api-access-b2rz2") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "kube-api-access-b2rz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:33:53.399406 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.399275 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:33:53.399601 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.399571 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-config-out" (OuterVolumeSpecName: "config-out") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:53.399601 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.399591 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-config-volume" (OuterVolumeSpecName: "config-volume") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:53.399871 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.399841 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:53.399991 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.399972 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:53.400227 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.400212 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:53.402995 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.402969 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). 
InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:53.409867 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.409844 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-web-config" (OuterVolumeSpecName: "web-config") pod "0d94d7fa-516f-4583-967f-477abf72f68e" (UID: "0d94d7fa-516f-4583-967f-477abf72f68e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:53.497204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497154 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497199 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-main-tls\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497210 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497221 2579 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-cluster-tls-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497232 2579 reconciler_common.go:299] 
"Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-tls-assets\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497241 2579 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-config-volume\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497249 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2rz2\" (UniqueName: \"kubernetes.io/projected/0d94d7fa-516f-4583-967f-477abf72f68e-kube-api-access-b2rz2\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497257 2579 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d94d7fa-516f-4583-967f-477abf72f68e-metrics-client-ca\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497266 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497276 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d94d7fa-516f-4583-967f-477abf72f68e-config-out\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497285 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-web-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.497459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.497293 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d94d7fa-516f-4583-967f-477abf72f68e-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:33:53.878521 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.878493 2579 generic.go:358] "Generic (PLEG): container finished" podID="0d94d7fa-516f-4583-967f-477abf72f68e" containerID="8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc" exitCode=0 Apr 16 14:33:53.878658 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.878531 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc"} Apr 16 14:33:53.878658 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.878554 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0d94d7fa-516f-4583-967f-477abf72f68e","Type":"ContainerDied","Data":"b64c703cbffbe56c5b3495fa5117c770f27b50a10bf82fab0657820e8eac6ad5"} Apr 16 14:33:53.878658 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.878573 2579 scope.go:117] "RemoveContainer" containerID="3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444" Apr 16 14:33:53.878658 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.878594 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:33:53.886110 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.885948 2579 scope.go:117] "RemoveContainer" containerID="8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc" Apr 16 14:33:53.892505 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.892488 2579 scope.go:117] "RemoveContainer" containerID="1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23" Apr 16 14:33:53.898420 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.898403 2579 scope.go:117] "RemoveContainer" containerID="ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2" Apr 16 14:33:53.903311 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.903289 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:33:53.905750 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.905734 2579 scope.go:117] "RemoveContainer" containerID="d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf" Apr 16 14:33:53.912344 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.912320 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:33:53.913516 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.913496 2579 scope.go:117] "RemoveContainer" containerID="1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc" Apr 16 14:33:53.919932 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.919916 2579 scope.go:117] "RemoveContainer" containerID="8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae" Apr 16 14:33:53.926401 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.926384 2579 scope.go:117] "RemoveContainer" containerID="3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444" Apr 16 14:33:53.926664 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:53.926647 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444\": container with ID starting with 3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444 not found: ID does not exist" containerID="3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444"
Apr 16 14:33:53.926716 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.926671 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444"} err="failed to get container status \"3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444\": rpc error: code = NotFound desc = could not find container \"3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444\": container with ID starting with 3cc1373da73aae10a4a18e00c3f072e519fed301d0fca7fa1e8aac012203c444 not found: ID does not exist"
Apr 16 14:33:53.926716 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.926689 2579 scope.go:117] "RemoveContainer" containerID="8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc"
Apr 16 14:33:53.926910 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:53.926893 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc\": container with ID starting with 8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc not found: ID does not exist" containerID="8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc"
Apr 16 14:33:53.926951 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.926916 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc"} err="failed to get container status \"8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc\": rpc error: code = NotFound desc = could not find container \"8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc\": container with ID starting with 8c830c8a33f89aebafa79bfb4c8580b55cfdf3c1ccd754419cdbe70a762092cc not found: ID does not exist"
Apr 16 14:33:53.926951 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.926933 2579 scope.go:117] "RemoveContainer" containerID="1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23"
Apr 16 14:33:53.927193 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:53.927176 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23\": container with ID starting with 1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23 not found: ID does not exist" containerID="1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23"
Apr 16 14:33:53.927250 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927197 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23"} err="failed to get container status \"1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23\": rpc error: code = NotFound desc = could not find container \"1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23\": container with ID starting with 1c8e13c5e18ac0c64dd3faed34b1e673389e3884e068b914ab3067414b490c23 not found: ID does not exist"
Apr 16 14:33:53.927250 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927214 2579 scope.go:117] "RemoveContainer" containerID="ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2"
Apr 16 14:33:53.927439 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:53.927421 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2\": container with ID starting with ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2 not found: ID does not exist" containerID="ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2"
Apr 16 14:33:53.927502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927460 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2"} err="failed to get container status \"ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2\": rpc error: code = NotFound desc = could not find container \"ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2\": container with ID starting with ff1e5e1ce1371e70f2726eaa8b50fd19d471e4d0a8ea317cfb4724be6cbe1ec2 not found: ID does not exist"
Apr 16 14:33:53.927502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927484 2579 scope.go:117] "RemoveContainer" containerID="d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf"
Apr 16 14:33:53.927728 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:53.927708 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf\": container with ID starting with d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf not found: ID does not exist" containerID="d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf"
Apr 16 14:33:53.927770 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927732 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf"} err="failed to get container status \"d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf\": rpc error: code = NotFound desc = could not find container \"d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf\": container with ID starting with d80d276e03844e3cb060934813a7699afa3976f592d7b7dc399fb7d332a9b3bf not found: ID does not exist"
Apr 16 14:33:53.927770 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927746 2579 scope.go:117] "RemoveContainer" containerID="1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc"
Apr 16 14:33:53.927946 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:53.927933 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc\": container with ID starting with 1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc not found: ID does not exist" containerID="1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc"
Apr 16 14:33:53.927992 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927948 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc"} err="failed to get container status \"1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc\": rpc error: code = NotFound desc = could not find container \"1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc\": container with ID starting with 1ec7b0a8032fdba3cd3d09d3faa906e4ed1de4cb05150807b8756dd7d3932bdc not found: ID does not exist"
Apr 16 14:33:53.927992 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.927959 2579 scope.go:117] "RemoveContainer" containerID="8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae"
Apr 16 14:33:53.928159 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:53.928142 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae\": container with ID starting with 8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae not found: ID does not exist" containerID="8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae"
Apr 16 14:33:53.928214 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.928161 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae"} err="failed to get container status \"8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae\": rpc error: code = NotFound desc = could not find container \"8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae\": container with ID starting with 8bba2946d5c6134cc944237e0e002ffb2e2756982b9be941fb43a0e2118a83ae not found: ID does not exist"
Apr 16 14:33:53.942893 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.942870 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:33:53.943254 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943238 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="config-reloader"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943256 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="config-reloader"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943268 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="954a0b17-ede2-4a5f-a717-fb8f76331dee" containerName="registry"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943277 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="954a0b17-ede2-4a5f-a717-fb8f76331dee" containerName="registry"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943294 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-metric"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943305 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-metric"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943321 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="alertmanager"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943329 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="alertmanager"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943340 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-web"
Apr 16 14:33:53.943348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943349 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-web"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943360 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943396 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943411 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="init-config-reloader"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943419 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="init-config-reloader"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943429 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="prom-label-proxy"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943438 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="prom-label-proxy"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943453 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2840ca5d-533d-4e77-8636-f68c1bfa7c31" containerName="console"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943461 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2840ca5d-533d-4e77-8636-f68c1bfa7c31" containerName="console"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943552 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="config-reloader"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943564 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2840ca5d-533d-4e77-8636-f68c1bfa7c31" containerName="console"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943574 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="alertmanager"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943586 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-metric"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943597 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="954a0b17-ede2-4a5f-a717-fb8f76331dee" containerName="registry"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943608 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943618 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="prom-label-proxy"
Apr 16 14:33:53.943776 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.943629 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" containerName="kube-rbac-proxy-web"
Apr 16 14:33:53.947242 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.947225 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:53.949418 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.949400 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 14:33:53.949516 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.949417 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 14:33:53.949667 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.949654 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 14:33:53.949782 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.949765 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 14:33:53.949841 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.949785 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 14:33:53.949841 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.949817 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 14:33:53.949962 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.949918 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 14:33:53.950202 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.950180 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dr2rr\""
Apr 16 14:33:53.950305 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.950205 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 14:33:53.956935 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.956913 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 14:33:53.959948 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.959927 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:33:53.978326 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:53.978305 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d94d7fa-516f-4583-967f-477abf72f68e" path="/var/lib/kubelet/pods/0d94d7fa-516f-4583-967f-477abf72f68e/volumes"
Apr 16 14:33:54.101780 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.101736 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-config-volume\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.101957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.101786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.101957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.101817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5bff235-fbd3-4652-b00f-7ab1933a3a94-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.101957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.101862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.101957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.101906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bc52\" (UniqueName: \"kubernetes.io/projected/b5bff235-fbd3-4652-b00f-7ab1933a3a94-kube-api-access-6bc52\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.101957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.101932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5bff235-fbd3-4652-b00f-7ab1933a3a94-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.101957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.101955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-web-config\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.102231 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.102014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5bff235-fbd3-4652-b00f-7ab1933a3a94-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.102231 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.102062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.102231 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.102086 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b5bff235-fbd3-4652-b00f-7ab1933a3a94-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.102231 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.102124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5bff235-fbd3-4652-b00f-7ab1933a3a94-config-out\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.102414 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.102269 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.102414 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.102323 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.202722 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.202677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.202881 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.202731 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-config-volume\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.202881 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.202799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.202881 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.202819 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5bff235-fbd3-4652-b00f-7ab1933a3a94-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.202881 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.202857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203113 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.202981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bc52\" (UniqueName: \"kubernetes.io/projected/b5bff235-fbd3-4652-b00f-7ab1933a3a94-kube-api-access-6bc52\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203113 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5bff235-fbd3-4652-b00f-7ab1933a3a94-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203113 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-web-config\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203253 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5bff235-fbd3-4652-b00f-7ab1933a3a94-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203253 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203253 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b5bff235-fbd3-4652-b00f-7ab1933a3a94-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203470 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203259 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5bff235-fbd3-4652-b00f-7ab1933a3a94-config-out\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203470 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203293 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.203824 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.203796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5bff235-fbd3-4652-b00f-7ab1933a3a94-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206336 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.205942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206336 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.205950 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5bff235-fbd3-4652-b00f-7ab1933a3a94-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206336 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206175 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-config-volume\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206336 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206336 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206336 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206294 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b5bff235-fbd3-4652-b00f-7ab1933a3a94-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206600 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206600 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206544 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-web-config\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206600 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5bff235-fbd3-4652-b00f-7ab1933a3a94-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.206685 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.206670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5bff235-fbd3-4652-b00f-7ab1933a3a94-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.208054 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.208018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5bff235-fbd3-4652-b00f-7ab1933a3a94-config-out\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.212199 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.212176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bc52\" (UniqueName: \"kubernetes.io/projected/b5bff235-fbd3-4652-b00f-7ab1933a3a94-kube-api-access-6bc52\") pod \"alertmanager-main-0\" (UID: \"b5bff235-fbd3-4652-b00f-7ab1933a3a94\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.260070 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.260023 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:33:54.396212 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.396135 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:33:54.400458 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:33:54.400429 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5bff235_fbd3_4652_b00f_7ab1933a3a94.slice/crio-5c65e2e831178235b842a4c925bed5a9a77b660b9b62203dd94a69706d66f133 WatchSource:0}: Error finding container 5c65e2e831178235b842a4c925bed5a9a77b660b9b62203dd94a69706d66f133: Status 404 returned error can't find the container with id 5c65e2e831178235b842a4c925bed5a9a77b660b9b62203dd94a69706d66f133
Apr 16 14:33:54.883477 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.883445 2579 generic.go:358] "Generic (PLEG): container finished" podID="b5bff235-fbd3-4652-b00f-7ab1933a3a94" containerID="7da91cf27fa17f8f43824c81704b15291b26de6dcf9b0811c27e506cad79ab28" exitCode=0
Apr 16 14:33:54.883654 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.883489 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerDied","Data":"7da91cf27fa17f8f43824c81704b15291b26de6dcf9b0811c27e506cad79ab28"}
Apr 16 14:33:54.883654 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:54.883510 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerStarted","Data":"5c65e2e831178235b842a4c925bed5a9a77b660b9b62203dd94a69706d66f133"}
Apr 16 14:33:55.889292 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:55.889252 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerStarted","Data":"d804f411ce857a5b89cd0504848861845a25f51b4c711efd177c4c992a496cf0"}
Apr 16 14:33:55.889292 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:55.889296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerStarted","Data":"234ad1cb503fe227055c4dd4d90113116ed6a58e0558e3e98e4fc1372ca3f6a1"}
Apr 16 14:33:55.889774 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:55.889309 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerStarted","Data":"4dc35966764aa5b8fed02be6fa61af4a2593576b697fd6d3aa54c4b2abab095c"}
Apr 16 14:33:55.889774 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:55.889321 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerStarted","Data":"d3cc7023b48dce962ae252a65254a56568065fe876ef474cd3af962b4989e312"}
Apr 16 14:33:55.889774 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:55.889331 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerStarted","Data":"04893afa42eec96e7e6456b02d801685f206cde31d1551312d4477395bbf12da"}
Apr 16 14:33:55.889774 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:55.889343 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5bff235-fbd3-4652-b00f-7ab1933a3a94","Type":"ContainerStarted","Data":"5a29484780a0d6e1fb83b8efd6077a00f9162fe1416f0de7e930d4b0b533d30f"}
Apr 16 14:33:55.920244 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:55.920149 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.920130275 podStartE2EDuration="2.920130275s" podCreationTimestamp="2026-04-16 14:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:33:55.918098882 +0000 UTC m=+242.535888644" watchObservedRunningTime="2026-04-16 14:33:55.920130275 +0000 UTC m=+242.537920040"
Apr 16 14:33:56.066706 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.066671 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr"]
Apr 16 14:33:56.069390 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.069373 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr"
Apr 16 14:33:56.071677 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.071640 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 14:33:56.071826 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.071653 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 14:33:56.071826 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.071653 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 14:33:56.071826 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.071659 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 14:33:56.071826 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.071735 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 14:33:56.071826 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.071746 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-k62h5\""
Apr 16 14:33:56.081862 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.081834 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 14:33:56.082653 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.082623 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr"]
Apr 16 14:33:56.224277 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-metrics-client-ca\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.224277 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-secret-telemeter-client\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.224277 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224254 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmthq\" (UniqueName: \"kubernetes.io/projected/b19fca59-c615-4671-80aa-d1f9d57ddd74-kube-api-access-lmthq\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.224527 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-telemeter-client-tls\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.224527 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224302 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-federate-client-tls\") pod 
\"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.224527 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.224527 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.224704 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.224560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-serving-certs-ca-bundle\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.325792 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.325748 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-metrics-client-ca\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " 
pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.325792 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.325793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-secret-telemeter-client\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326078 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.325821 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmthq\" (UniqueName: \"kubernetes.io/projected/b19fca59-c615-4671-80aa-d1f9d57ddd74-kube-api-access-lmthq\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326078 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.325844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-telemeter-client-tls\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326078 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.325861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-federate-client-tls\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326078 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.325917 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326078 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.325955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326331 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.326134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-serving-certs-ca-bundle\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326737 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.326705 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-metrics-client-ca\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.326822 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-serving-certs-ca-bundle\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: 
\"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.326855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.326832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19fca59-c615-4671-80aa-d1f9d57ddd74-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.328484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.328445 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.328608 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.328589 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-federate-client-tls\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.328940 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.328919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-secret-telemeter-client\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.328980 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.328916 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b19fca59-c615-4671-80aa-d1f9d57ddd74-telemeter-client-tls\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.335897 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.335876 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmthq\" (UniqueName: \"kubernetes.io/projected/b19fca59-c615-4671-80aa-d1f9d57ddd74-kube-api-access-lmthq\") pod \"telemeter-client-6f8d7b8476-mqfpr\" (UID: \"b19fca59-c615-4671-80aa-d1f9d57ddd74\") " pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.353896 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.353871 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:33:56.354374 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.354337 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="prometheus" containerID="cri-o://22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e" gracePeriod=600 Apr 16 14:33:56.354374 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.354367 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy-thanos" containerID="cri-o://6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2" gracePeriod=600 Apr 16 14:33:56.354558 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.354388 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" 
containerName="kube-rbac-proxy-web" containerID="cri-o://51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69" gracePeriod=600 Apr 16 14:33:56.354558 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.354341 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy" containerID="cri-o://09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7" gracePeriod=600 Apr 16 14:33:56.354558 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.354368 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="thanos-sidecar" containerID="cri-o://8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235" gracePeriod=600 Apr 16 14:33:56.354558 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.354413 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="config-reloader" containerID="cri-o://e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1" gracePeriod=600 Apr 16 14:33:56.383592 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.383568 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" Apr 16 14:33:56.516740 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.516714 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr"] Apr 16 14:33:56.519618 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:33:56.519588 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb19fca59_c615_4671_80aa_d1f9d57ddd74.slice/crio-1068f392ccd1e722cf783e650c8b091933d589e60b4d2a8ce084f104a3c01127 WatchSource:0}: Error finding container 1068f392ccd1e722cf783e650c8b091933d589e60b4d2a8ce084f104a3c01127: Status 404 returned error can't find the container with id 1068f392ccd1e722cf783e650c8b091933d589e60b4d2a8ce084f104a3c01127 Apr 16 14:33:56.593026 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.593002 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:56.730023 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.729927 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-web-config\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730023 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.729977 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-config\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730023 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730010 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-rulefiles-0\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730061 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-db\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730094 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-trusted-ca-bundle\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730125 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvrs4\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-kube-api-access-rvrs4\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730150 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-config-out\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730178 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730230 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-tls-assets\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730295 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-tls\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730322 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-thanos-prometheus-http-client-file\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730352 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-metrics-client-ca\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730379 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730409 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-metrics-client-certs\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730439 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-serving-certs-ca-bundle\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730464 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-grpc-tls\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730502 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-kubelet-serving-ca-bundle\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730540 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-kube-rbac-proxy\") pod \"671ec7c2-af4a-4a14-9063-810caf17324b\" (UID: \"671ec7c2-af4a-4a14-9063-810caf17324b\") " Apr 16 14:33:56.730994 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.730857 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:33:56.732123 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.732072 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:56.732256 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.732008 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:33:56.732256 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.732216 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:33:56.732544 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.732465 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:33:56.732544 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.732470 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:33:56.733134 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.733100 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-config" (OuterVolumeSpecName: "config") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:56.733823 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.733795 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:56.733939 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.733876 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:33:56.733939 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.733922 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:56.734358 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.734330 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:56.734641 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.734604 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:56.734641 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.734615 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-kube-api-access-rvrs4" (OuterVolumeSpecName: "kube-api-access-rvrs4") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "kube-api-access-rvrs4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:33:56.735004 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.734981 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:56.735209 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.735188 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:56.735510 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.735488 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:56.735510 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.735488 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-config-out" (OuterVolumeSpecName: "config-out") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:33:56.743690 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.743672 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-web-config" (OuterVolumeSpecName: "web-config") pod "671ec7c2-af4a-4a14-9063-810caf17324b" (UID: "671ec7c2-af4a-4a14-9063-810caf17324b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:56.832203 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832158 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-tls-assets\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832203 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832198 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832203 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832209 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832203 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832220 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-metrics-client-ca\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832230 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832239 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-metrics-client-certs\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832250 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832260 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-grpc-tls\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832269 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832277 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-kube-rbac-proxy\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832286 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-web-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832296 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-config\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832304 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832332 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-k8s-db\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832341 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671ec7c2-af4a-4a14-9063-810caf17324b-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832350 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvrs4\" (UniqueName: \"kubernetes.io/projected/671ec7c2-af4a-4a14-9063-810caf17324b-kube-api-access-rvrs4\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832360 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/671ec7c2-af4a-4a14-9063-810caf17324b-config-out\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.832456 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.832369 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/671ec7c2-af4a-4a14-9063-810caf17324b-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:33:56.894736 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894688 2579 generic.go:358] "Generic (PLEG): container finished" podID="671ec7c2-af4a-4a14-9063-810caf17324b" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2" exitCode=0
Apr 16 14:33:56.894736 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894736 2579 generic.go:358] "Generic (PLEG): container finished" podID="671ec7c2-af4a-4a14-9063-810caf17324b" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7" exitCode=0
Apr 16 14:33:56.894736 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894745 2579 generic.go:358] "Generic (PLEG): container finished" podID="671ec7c2-af4a-4a14-9063-810caf17324b" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69" exitCode=0
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894753 2579 generic.go:358] "Generic (PLEG): container finished" podID="671ec7c2-af4a-4a14-9063-810caf17324b" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235" exitCode=0
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894761 2579 generic.go:358] "Generic (PLEG): container finished" podID="671ec7c2-af4a-4a14-9063-810caf17324b" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1" exitCode=0
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894769 2579 generic.go:358] "Generic (PLEG): container finished" podID="671ec7c2-af4a-4a14-9063-810caf17324b" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e" exitCode=0
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"}
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"}
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894817 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894828 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"}
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894840 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"}
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"}
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894859 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"}
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894867 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"671ec7c2-af4a-4a14-9063-810caf17324b","Type":"ContainerDied","Data":"1deb5e7398df1adfb152cf19c244afbc2c5325a5a1c89c3b2704bfeff93840f1"}
Apr 16 14:33:56.895230 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.894882 2579 scope.go:117] "RemoveContainer" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"
Apr 16 14:33:56.896013 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.895990 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" event={"ID":"b19fca59-c615-4671-80aa-d1f9d57ddd74","Type":"ContainerStarted","Data":"1068f392ccd1e722cf783e650c8b091933d589e60b4d2a8ce084f104a3c01127"}
Apr 16 14:33:56.903103 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.903089 2579 scope.go:117] "RemoveContainer" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"
Apr 16 14:33:56.910236 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.910216 2579 scope.go:117] "RemoveContainer" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"
Apr 16 14:33:56.918754 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.918663 2579 scope.go:117] "RemoveContainer" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"
Apr 16 14:33:56.920379 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.920359 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:33:56.925146 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.925127 2579 scope.go:117] "RemoveContainer" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"
Apr 16 14:33:56.927560 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.927539 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:33:56.932186 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.932170 2579 scope.go:117] "RemoveContainer" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"
Apr 16 14:33:56.938774 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.938754 2579 scope.go:117] "RemoveContainer" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"
Apr 16 14:33:56.944946 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.944930 2579 scope.go:117] "RemoveContainer" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"
Apr 16 14:33:56.945210 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:56.945190 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": container with ID starting with 6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2 not found: ID does not exist" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"
Apr 16 14:33:56.945270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945217 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"} err="failed to get container status \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": rpc error: code = NotFound desc = could not find container \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": container with ID starting with 6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2 not found: ID does not exist"
Apr 16 14:33:56.945270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945236 2579 scope.go:117] "RemoveContainer" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"
Apr 16 14:33:56.945445 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:56.945431 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": container with ID starting with 09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7 not found: ID does not exist" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"
Apr 16 14:33:56.945484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945448 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"} err="failed to get container status \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": rpc error: code = NotFound desc = could not find container \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": container with ID starting with 09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7 not found: ID does not exist"
Apr 16 14:33:56.945484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945460 2579 scope.go:117] "RemoveContainer" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"
Apr 16 14:33:56.945693 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:56.945677 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": container with ID starting with 51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69 not found: ID does not exist" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"
Apr 16 14:33:56.945740 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945700 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"} err="failed to get container status \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": rpc error: code = NotFound desc = could not find container \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": container with ID starting with 51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69 not found: ID does not exist"
Apr 16 14:33:56.945740 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945714 2579 scope.go:117] "RemoveContainer" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"
Apr 16 14:33:56.945892 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:56.945877 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": container with ID starting with 8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235 not found: ID does not exist" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"
Apr 16 14:33:56.945926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945895 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"} err="failed to get container status \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": rpc error: code = NotFound desc = could not find container \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": container with ID starting with 8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235 not found: ID does not exist"
Apr 16 14:33:56.945926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.945908 2579 scope.go:117] "RemoveContainer" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"
Apr 16 14:33:56.946095 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:56.946078 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": container with ID starting with e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1 not found: ID does not exist" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"
Apr 16 14:33:56.946141 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946099 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"} err="failed to get container status \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": rpc error: code = NotFound desc = could not find container \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": container with ID starting with e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1 not found: ID does not exist"
Apr 16 14:33:56.946141 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946110 2579 scope.go:117] "RemoveContainer" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"
Apr 16 14:33:56.946344 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:56.946320 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": container with ID starting with 22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e not found: ID does not exist" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"
Apr 16 14:33:56.946416 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946344 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"} err="failed to get container status \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": rpc error: code = NotFound desc = could not find container \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": container with ID starting with 22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e not found: ID does not exist"
Apr 16 14:33:56.946416 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946366 2579 scope.go:117] "RemoveContainer" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"
Apr 16 14:33:56.946573 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:33:56.946558 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": container with ID starting with 3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395 not found: ID does not exist" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"
Apr 16 14:33:56.946614 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946576 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"} err="failed to get container status \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": rpc error: code = NotFound desc = could not find container \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": container with ID starting with 3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395 not found: ID does not exist"
Apr 16 14:33:56.946614 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946589 2579 scope.go:117] "RemoveContainer" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"
Apr 16 14:33:56.946810 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946795 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"} err="failed to get container status \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": rpc error: code = NotFound desc = could not find container \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": container with ID starting with 6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2 not found: ID does not exist"
Apr 16 14:33:56.946853 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946810 2579 scope.go:117] "RemoveContainer" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"
Apr 16 14:33:56.946987 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946973 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"} err="failed to get container status \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": rpc error: code = NotFound desc = could not find container \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": container with ID starting with 09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7 not found: ID does not exist"
Apr 16 14:33:56.947024 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.946987 2579 scope.go:117] "RemoveContainer" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"
Apr 16 14:33:56.947268 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947250 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"} err="failed to get container status \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": rpc error: code = NotFound desc = could not find container \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": container with ID starting with 51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69 not found: ID does not exist"
Apr 16 14:33:56.947311 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947269 2579 scope.go:117] "RemoveContainer" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"
Apr 16 14:33:56.947461 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947447 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"} err="failed to get container status \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": rpc error: code = NotFound desc = could not find container \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": container with ID starting with 8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235 not found: ID does not exist"
Apr 16 14:33:56.947507 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947461 2579 scope.go:117] "RemoveContainer" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"
Apr 16 14:33:56.947664 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947645 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"} err="failed to get container status \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": rpc error: code = NotFound desc = could not find container \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": container with ID starting with e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1 not found: ID does not exist"
Apr 16 14:33:56.947736 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947670 2579 scope.go:117] "RemoveContainer" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"
Apr 16 14:33:56.947893 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947876 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"} err="failed to get container status \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": rpc error: code = NotFound desc = could not find container \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": container with ID starting with 22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e not found: ID does not exist"
Apr 16 14:33:56.947951 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.947895 2579 scope.go:117] "RemoveContainer" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"
Apr 16 14:33:56.948110 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948091 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"} err="failed to get container status \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": rpc error: code = NotFound desc = could not find container \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": container with ID starting with 3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395 not found: ID does not exist"
Apr 16 14:33:56.948169 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948111 2579 scope.go:117] "RemoveContainer" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"
Apr 16 14:33:56.948313 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948295 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"} err="failed to get container status \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": rpc error: code = NotFound desc = could not find container \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": container with ID starting with 6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2 not found: ID does not exist"
Apr 16 14:33:56.948357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948313 2579 scope.go:117] "RemoveContainer" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"
Apr 16 14:33:56.948519 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948503 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"} err="failed to get container status \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": rpc error: code = NotFound desc = could not find container \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": container with ID starting with 09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7 not found: ID does not exist"
Apr 16 14:33:56.948561 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948519 2579 scope.go:117] "RemoveContainer" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"
Apr 16 14:33:56.948686 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948671 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"} err="failed to get container status \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": rpc error: code = NotFound desc = could not find container \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": container with ID starting with 51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69 not found: ID does not exist"
Apr 16 14:33:56.948724 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948686 2579 scope.go:117] "RemoveContainer" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"
Apr 16 14:33:56.948853 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948836 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"} err="failed to get container status \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": rpc error: code = NotFound desc = could not find container \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": container with ID starting with 8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235 not found: ID does not exist"
Apr 16 14:33:56.948935 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.948853 2579 scope.go:117] "RemoveContainer" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"
Apr 16 14:33:56.949133 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949115 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"} err="failed to get container status \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": rpc error: code = NotFound desc = could not find container \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": container with ID starting with e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1 not found: ID does not exist"
Apr 16 14:33:56.949201 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949133 2579 scope.go:117] "RemoveContainer" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"
Apr 16 14:33:56.949337 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949321 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"} err="failed to get container status \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": rpc error: code = NotFound desc = could not find container \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": container with ID starting with 22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e not found: ID does not exist"
Apr 16 14:33:56.949377 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949336 2579 scope.go:117] "RemoveContainer" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"
Apr 16 14:33:56.949494 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949480 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"} err="failed to get container status \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": rpc error: code = NotFound desc = could not find container \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": container with ID starting with 3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395 not found: ID does not exist"
Apr 16 14:33:56.949536 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949494 2579 scope.go:117] "RemoveContainer" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"
Apr 16 14:33:56.949636 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949623 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"} err="failed to get container status \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": rpc error: code = NotFound desc = could not find container \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": container with ID starting with 6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2 not found: ID does not exist"
Apr 16 14:33:56.949636 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949635 2579 scope.go:117] "RemoveContainer" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"
Apr 16 14:33:56.949780 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949766 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"} err="failed to get container status \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": rpc error: code = NotFound desc = could not find container \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": container with ID starting with 09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7 not found: ID does not exist"
Apr 16 14:33:56.949780 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949779 2579 scope.go:117] "RemoveContainer" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"
Apr 16 14:33:56.949957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949943 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"} err="failed to get container status \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": rpc error: code = NotFound desc = could not find container \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": container with ID starting with 51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69 not found: ID does not exist"
Apr 16 14:33:56.949957 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.949956 2579 scope.go:117] "RemoveContainer" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"
Apr 16 14:33:56.950124 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950109 2579
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"} err="failed to get container status \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": rpc error: code = NotFound desc = could not find container \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": container with ID starting with 8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235 not found: ID does not exist" Apr 16 14:33:56.950172 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950123 2579 scope.go:117] "RemoveContainer" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1" Apr 16 14:33:56.950282 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950265 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"} err="failed to get container status \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": rpc error: code = NotFound desc = could not find container \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": container with ID starting with e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1 not found: ID does not exist" Apr 16 14:33:56.950323 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950281 2579 scope.go:117] "RemoveContainer" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e" Apr 16 14:33:56.950431 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950418 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"} err="failed to get container status \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": rpc error: code = NotFound desc = could not find container 
\"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": container with ID starting with 22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e not found: ID does not exist" Apr 16 14:33:56.950477 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950431 2579 scope.go:117] "RemoveContainer" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395" Apr 16 14:33:56.950598 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950580 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"} err="failed to get container status \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": rpc error: code = NotFound desc = could not find container \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": container with ID starting with 3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395 not found: ID does not exist" Apr 16 14:33:56.950637 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950605 2579 scope.go:117] "RemoveContainer" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2" Apr 16 14:33:56.950843 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950824 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"} err="failed to get container status \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": rpc error: code = NotFound desc = could not find container \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": container with ID starting with 6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2 not found: ID does not exist" Apr 16 14:33:56.950885 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.950844 2579 scope.go:117] "RemoveContainer" 
containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7" Apr 16 14:33:56.951074 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951017 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"} err="failed to get container status \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": rpc error: code = NotFound desc = could not find container \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": container with ID starting with 09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7 not found: ID does not exist" Apr 16 14:33:56.951074 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951052 2579 scope.go:117] "RemoveContainer" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69" Apr 16 14:33:56.951255 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951239 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"} err="failed to get container status \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": rpc error: code = NotFound desc = could not find container \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": container with ID starting with 51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69 not found: ID does not exist" Apr 16 14:33:56.951297 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951256 2579 scope.go:117] "RemoveContainer" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235" Apr 16 14:33:56.951457 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951430 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"} err="failed to get container status 
\"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": rpc error: code = NotFound desc = could not find container \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": container with ID starting with 8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235 not found: ID does not exist" Apr 16 14:33:56.951457 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951448 2579 scope.go:117] "RemoveContainer" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1" Apr 16 14:33:56.951638 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951624 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"} err="failed to get container status \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": rpc error: code = NotFound desc = could not find container \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": container with ID starting with e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1 not found: ID does not exist" Apr 16 14:33:56.951638 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951638 2579 scope.go:117] "RemoveContainer" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e" Apr 16 14:33:56.951878 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.951837 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"} err="failed to get container status \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": rpc error: code = NotFound desc = could not find container \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": container with ID starting with 22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e not found: ID does not exist" Apr 16 14:33:56.951878 ip-10-0-140-144 
kubenswrapper[2579]: I0416 14:33:56.951862 2579 scope.go:117] "RemoveContainer" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395" Apr 16 14:33:56.952055 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.952014 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"} err="failed to get container status \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": rpc error: code = NotFound desc = could not find container \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": container with ID starting with 3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395 not found: ID does not exist" Apr 16 14:33:56.952169 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.952057 2579 scope.go:117] "RemoveContainer" containerID="6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2" Apr 16 14:33:56.952321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.952291 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2"} err="failed to get container status \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": rpc error: code = NotFound desc = could not find container \"6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2\": container with ID starting with 6aecb82c390b3b4a1694e94444b79a7abd02a3a46061af0183604f035bfd94e2 not found: ID does not exist" Apr 16 14:33:56.952321 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.952312 2579 scope.go:117] "RemoveContainer" containerID="09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7" Apr 16 14:33:56.952745 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.952690 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7"} err="failed to get container status \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": rpc error: code = NotFound desc = could not find container \"09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7\": container with ID starting with 09e1dca3edfe1ec03630522aecb6353b8537a9d44185197dad8cbf19ad5a33f7 not found: ID does not exist" Apr 16 14:33:56.952745 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.952745 2579 scope.go:117] "RemoveContainer" containerID="51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69" Apr 16 14:33:56.953058 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953013 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69"} err="failed to get container status \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": rpc error: code = NotFound desc = could not find container \"51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69\": container with ID starting with 51b6862e8a379d17f76c716705c7d35da8f41c207876286b6039b86e5f4c4b69 not found: ID does not exist" Apr 16 14:33:56.953151 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953059 2579 scope.go:117] "RemoveContainer" containerID="8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235" Apr 16 14:33:56.953318 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953298 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235"} err="failed to get container status \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": rpc error: code = NotFound desc = could not find container \"8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235\": container with ID starting with 
8c99b92bcf475f9fa2bba735fc7925902619314e8720b5833322e9ce44d1d235 not found: ID does not exist" Apr 16 14:33:56.953425 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953321 2579 scope.go:117] "RemoveContainer" containerID="e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1" Apr 16 14:33:56.953541 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953516 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1"} err="failed to get container status \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": rpc error: code = NotFound desc = could not find container \"e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1\": container with ID starting with e1c5c42232f7e3ef4fba8760a0b33923611752b74b0a4f5ce645845f78653be1 not found: ID does not exist" Apr 16 14:33:56.953541 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953539 2579 scope.go:117] "RemoveContainer" containerID="22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e" Apr 16 14:33:56.953766 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953748 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e"} err="failed to get container status \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": rpc error: code = NotFound desc = could not find container \"22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e\": container with ID starting with 22b728e52be67622b82e89a6a849d4a5804b386e184c55c770fc425026ef6b1e not found: ID does not exist" Apr 16 14:33:56.953822 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953768 2579 scope.go:117] "RemoveContainer" containerID="3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395" Apr 16 14:33:56.953999 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.953977 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395"} err="failed to get container status \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": rpc error: code = NotFound desc = could not find container \"3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395\": container with ID starting with 3385d56c60155e59f6472b0ce3848a7b5d827b42c6eae7113fa99f55e9f2e395 not found: ID does not exist" Apr 16 14:33:56.954501 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954485 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:33:56.954829 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954816 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy" Apr 16 14:33:56.954873 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954831 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy" Apr 16 14:33:56.954873 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954844 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="config-reloader" Apr 16 14:33:56.954873 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954850 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="config-reloader" Apr 16 14:33:56.954873 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954861 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="prometheus" Apr 16 14:33:56.954873 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954867 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="prometheus"
Apr 16 14:33:56.954873 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954873 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="thanos-sidecar"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954878 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="thanos-sidecar"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954884 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy-web"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954891 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy-web"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954905 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy-thanos"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954912 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy-thanos"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954921 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="init-config-reloader"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954928 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="init-config-reloader"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954977 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="prometheus"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954987 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy-thanos"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.954998 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy-web"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.955008 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="kube-rbac-proxy"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.955015 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="config-reloader"
Apr 16 14:33:56.955057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.955022 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" containerName="thanos-sidecar"
Apr 16 14:33:56.958750 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.958730 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:56.961116 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961096 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cmr8bvo245iv2\""
Apr 16 14:33:56.961197 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961115 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:33:56.961197 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961115 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:33:56.961474 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961455 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:33:56.961577 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961555 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2xbnh\""
Apr 16 14:33:56.961636 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961558 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:33:56.961636 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961589 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:33:56.961732 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961642 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:33:56.961782 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961730 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:33:56.961817 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961784 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:33:56.961817 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961802 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:33:56.961874 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.961853 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:33:56.965113 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.965095 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:33:56.967484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.967465 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 14:33:56.972176 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:56.972155 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:33:57.033750 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.033721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.033897 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.033787 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6f4294d-d673-4af4-9961-72525953ee7f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.033897 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.033815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vw2x\" (UniqueName: \"kubernetes.io/projected/b6f4294d-d673-4af4-9961-72525953ee7f-kube-api-access-6vw2x\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.033897 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.033870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034105 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.033902 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-config\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034161 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6f4294d-d673-4af4-9961-72525953ee7f-config-out\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034241 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034308 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034290 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034376 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034324 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034376 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034468 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034374 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034468 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034401 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034572 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-web-config\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034572 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034499 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034572 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034559 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034705 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034705 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.034705 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.034650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.135194 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6f4294d-d673-4af4-9961-72525953ee7f-config-out\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.135194 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135250 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135311 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-tls\") pod
\"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135345 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-web-config\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135372 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135433 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135563 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.135832 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.135608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6f4294d-d673-4af4-9961-72525953ee7f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.136501 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.136131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vw2x\" (UniqueName: \"kubernetes.io/projected/b6f4294d-d673-4af4-9961-72525953ee7f-kube-api-access-6vw2x\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.136501 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.136221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.136501 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.136263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-config\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.137315 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.136967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.137315 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.137243 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.137927 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.137891 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6f4294d-d673-4af4-9961-72525953ee7f-config-out\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.138913 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.138734 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.139016 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.138930 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.139097 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.139013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.139420 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.139395 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.139856 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.139831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-config\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.140420 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.140388 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.140511 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.140456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.141521 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.141222 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.141521 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.141457 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.141761 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.141739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
14:33:57.141818 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.141768 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6f4294d-d673-4af4-9961-72525953ee7f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.145166 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.143147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-web-config\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.145166 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.143580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b6f4294d-d673-4af4-9961-72525953ee7f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.145166 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.144217 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b6f4294d-d673-4af4-9961-72525953ee7f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.150357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.150336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vw2x\" (UniqueName: \"kubernetes.io/projected/b6f4294d-d673-4af4-9961-72525953ee7f-kube-api-access-6vw2x\") pod \"prometheus-k8s-0\" (UID: \"b6f4294d-d673-4af4-9961-72525953ee7f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.268888 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.268851 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:33:57.413806 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.413773 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:33:57.416553 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:33:57.416522 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f4294d_d673_4af4_9961_72525953ee7f.slice/crio-7498349bb27cf1f4a02a5a1fbffb5c1957e2e1bed987ae87347c8857727526e6 WatchSource:0}: Error finding container 7498349bb27cf1f4a02a5a1fbffb5c1957e2e1bed987ae87347c8857727526e6: Status 404 returned error can't find the container with id 7498349bb27cf1f4a02a5a1fbffb5c1957e2e1bed987ae87347c8857727526e6 Apr 16 14:33:57.900991 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.900951 2579 generic.go:358] "Generic (PLEG): container finished" podID="b6f4294d-d673-4af4-9961-72525953ee7f" containerID="2b4e4c6bd412aee47fb18086e52a88e7017c52d06962d6f088342e84db913513" exitCode=0 Apr 16 14:33:57.901483 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.901076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerDied","Data":"2b4e4c6bd412aee47fb18086e52a88e7017c52d06962d6f088342e84db913513"} Apr 16 14:33:57.901483 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.901116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerStarted","Data":"7498349bb27cf1f4a02a5a1fbffb5c1957e2e1bed987ae87347c8857727526e6"} Apr 16 14:33:57.978770 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:57.978738 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="671ec7c2-af4a-4a14-9063-810caf17324b" path="/var/lib/kubelet/pods/671ec7c2-af4a-4a14-9063-810caf17324b/volumes" Apr 16 14:33:58.912332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:58.912296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerStarted","Data":"17ca8acc7cbf87363b33c8b3eb599fe971ae0a76cc736a033203031966dd9b1c"} Apr 16 14:33:58.912332 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:58.912337 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerStarted","Data":"47aad0dda75624b22e2b7e29579358cb3c6531ffaa04576194782bacc397e376"} Apr 16 14:33:58.912886 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:58.912351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerStarted","Data":"3f178f1e8c0b554c301041bd7b831ce0f3c1e7decd1127ece46ffde02f7401b9"} Apr 16 14:33:58.912886 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:58.912366 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerStarted","Data":"c85a70be13410c3d01779070b2a9a151a5f85abc255fe11195f7186edc604966"} Apr 16 14:33:58.912886 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:58.912378 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerStarted","Data":"a782d506e17a35d7c5c790bb7d1a6b7f2f3e537f557adc8ce39dce330a3cd9db"} Apr 16 14:33:58.912886 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:58.912391 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b6f4294d-d673-4af4-9961-72525953ee7f","Type":"ContainerStarted","Data":"fe5d701f9a1127e3a0d303c9e771f123cc673f4b09abde9baf0d6e6e34af664b"} Apr 16 14:33:58.940916 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:58.940812 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.940794343 podStartE2EDuration="2.940794343s" podCreationTimestamp="2026-04-16 14:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:33:58.939263704 +0000 UTC m=+245.557053470" watchObservedRunningTime="2026-04-16 14:33:58.940794343 +0000 UTC m=+245.558584108" Apr 16 14:33:59.917340 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:59.917303 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" event={"ID":"b19fca59-c615-4671-80aa-d1f9d57ddd74","Type":"ContainerStarted","Data":"23f498cde5bca74e94f10fa67829b1d02c45de7e08da92d4dc694944e853e57a"} Apr 16 14:33:59.917340 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:59.917343 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" event={"ID":"b19fca59-c615-4671-80aa-d1f9d57ddd74","Type":"ContainerStarted","Data":"dfc181e6eb50c193f7ee89232378d9a5dc18f78dc98c66459176c6a7e4be98f3"} Apr 16 14:33:59.917858 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:59.917353 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" event={"ID":"b19fca59-c615-4671-80aa-d1f9d57ddd74","Type":"ContainerStarted","Data":"ae6e218f05f7d63575c499d7c85243287a4c5738775e0b680fc080c046c4b40e"} Apr 16 14:33:59.941625 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:33:59.941580 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/telemeter-client-6f8d7b8476-mqfpr" podStartSLOduration=1.2896396239999999 podStartE2EDuration="3.94156401s" podCreationTimestamp="2026-04-16 14:33:56 +0000 UTC" firstStartedPulling="2026-04-16 14:33:56.521485877 +0000 UTC m=+243.139275623" lastFinishedPulling="2026-04-16 14:33:59.173410264 +0000 UTC m=+245.791200009" observedRunningTime="2026-04-16 14:33:59.940240635 +0000 UTC m=+246.558030421" watchObservedRunningTime="2026-04-16 14:33:59.94156401 +0000 UTC m=+246.559353778" Apr 16 14:34:02.269486 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:02.269449 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:34:04.705718 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:04.705677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:34:04.708205 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:04.708180 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20614211-2bef-41dc-aad8-94242eb8364c-metrics-certs\") pod \"network-metrics-daemon-m57qr\" (UID: \"20614211-2bef-41dc-aad8-94242eb8364c\") " pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:34:04.777426 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:04.777398 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bl9dn\"" Apr 16 14:34:04.785348 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:04.785326 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m57qr" Apr 16 14:34:04.903185 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:04.903158 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m57qr"] Apr 16 14:34:04.905806 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:34:04.905776 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20614211_2bef_41dc_aad8_94242eb8364c.slice/crio-70ec7b44cb4a375f6787498c319762bac74550c10ab6e1637ba4754394264b94 WatchSource:0}: Error finding container 70ec7b44cb4a375f6787498c319762bac74550c10ab6e1637ba4754394264b94: Status 404 returned error can't find the container with id 70ec7b44cb4a375f6787498c319762bac74550c10ab6e1637ba4754394264b94 Apr 16 14:34:04.937254 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:04.937220 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m57qr" event={"ID":"20614211-2bef-41dc-aad8-94242eb8364c","Type":"ContainerStarted","Data":"70ec7b44cb4a375f6787498c319762bac74550c10ab6e1637ba4754394264b94"} Apr 16 14:34:05.942250 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:05.942195 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m57qr" event={"ID":"20614211-2bef-41dc-aad8-94242eb8364c","Type":"ContainerStarted","Data":"e0076d11ff95f3d67000b309483ef44e19e9360802525199649d0868d673a3b7"} Apr 16 14:34:06.946460 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:06.946426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m57qr" event={"ID":"20614211-2bef-41dc-aad8-94242eb8364c","Type":"ContainerStarted","Data":"18d6309397f3058be6c088f522c00375e019181ef4e0219095469705abd3cc22"} Apr 16 14:34:06.965690 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:06.965642 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-m57qr" podStartSLOduration=252.093929768 podStartE2EDuration="4m12.965625642s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:34:04.907555062 +0000 UTC m=+251.525344802" lastFinishedPulling="2026-04-16 14:34:05.779250917 +0000 UTC m=+252.397040676" observedRunningTime="2026-04-16 14:34:06.964405894 +0000 UTC m=+253.582195657" watchObservedRunningTime="2026-04-16 14:34:06.965625642 +0000 UTC m=+253.583415478" Apr 16 14:34:53.853902 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:53.853871 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:34:53.854738 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:53.854715 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:34:53.856288 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:53.856267 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:34:53.857614 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:53.857592 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:34:53.864006 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:53.863986 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:34:57.269108 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:57.269067 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:34:57.284849 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:57.284823 2579 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:34:58.109930 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:34:58.109903 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:35:56.542914 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.542827 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-j5tck"] Apr 16 14:35:56.546285 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.546266 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j5tck" Apr 16 14:35:56.548273 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.548253 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:35:56.551459 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.551437 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j5tck"] Apr 16 14:35:56.644418 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.644374 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32a0461a-84ba-4c24-a026-df53dabb8bbe-dbus\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck" Apr 16 14:35:56.644594 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.644472 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32a0461a-84ba-4c24-a026-df53dabb8bbe-original-pull-secret\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck" Apr 16 14:35:56.644594 ip-10-0-140-144 kubenswrapper[2579]: I0416 
14:35:56.644530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32a0461a-84ba-4c24-a026-df53dabb8bbe-kubelet-config\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck" Apr 16 14:35:56.745748 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.745713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32a0461a-84ba-4c24-a026-df53dabb8bbe-dbus\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck" Apr 16 14:35:56.745910 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.745785 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32a0461a-84ba-4c24-a026-df53dabb8bbe-original-pull-secret\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck" Apr 16 14:35:56.745910 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.745836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32a0461a-84ba-4c24-a026-df53dabb8bbe-kubelet-config\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck" Apr 16 14:35:56.745989 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.745911 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/32a0461a-84ba-4c24-a026-df53dabb8bbe-kubelet-config\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " 
pod="kube-system/global-pull-secret-syncer-j5tck"
Apr 16 14:35:56.745989 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.745934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/32a0461a-84ba-4c24-a026-df53dabb8bbe-dbus\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck"
Apr 16 14:35:56.748138 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.748119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/32a0461a-84ba-4c24-a026-df53dabb8bbe-original-pull-secret\") pod \"global-pull-secret-syncer-j5tck\" (UID: \"32a0461a-84ba-4c24-a026-df53dabb8bbe\") " pod="kube-system/global-pull-secret-syncer-j5tck"
Apr 16 14:35:56.856342 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.856255 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-j5tck"
Apr 16 14:35:56.980562 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.980538 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-j5tck"]
Apr 16 14:35:56.983582 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:35:56.983552 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a0461a_84ba_4c24_a026_df53dabb8bbe.slice/crio-e604e332b728fba01b42c2c784efa81b0a3989f785da3b3613bab79472eba541 WatchSource:0}: Error finding container e604e332b728fba01b42c2c784efa81b0a3989f785da3b3613bab79472eba541: Status 404 returned error can't find the container with id e604e332b728fba01b42c2c784efa81b0a3989f785da3b3613bab79472eba541
Apr 16 14:35:56.985418 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:56.985399 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:35:57.267991 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:35:57.267902 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j5tck" event={"ID":"32a0461a-84ba-4c24-a026-df53dabb8bbe","Type":"ContainerStarted","Data":"e604e332b728fba01b42c2c784efa81b0a3989f785da3b3613bab79472eba541"}
Apr 16 14:36:01.280662 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:36:01.280630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-j5tck" event={"ID":"32a0461a-84ba-4c24-a026-df53dabb8bbe","Type":"ContainerStarted","Data":"e7e468ba1a6c1dfe3ee75e026ae696ff93d5e3280e331e6c75dd1337e5d7a584"}
Apr 16 14:36:01.297937 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:36:01.297891 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-j5tck" podStartSLOduration=1.47836124 podStartE2EDuration="5.29787593s" podCreationTimestamp="2026-04-16 14:35:56 +0000 UTC" firstStartedPulling="2026-04-16 14:35:56.985557951 +0000 UTC m=+363.603347693" lastFinishedPulling="2026-04-16 14:36:00.805072636 +0000 UTC m=+367.422862383" observedRunningTime="2026-04-16 14:36:01.296352691 +0000 UTC m=+367.914142454" watchObservedRunningTime="2026-04-16 14:36:01.29787593 +0000 UTC m=+367.915665692"
Apr 16 14:37:40.886355 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.886321 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-97xc6"]
Apr 16 14:37:40.889515 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.889498 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:40.895384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.895363 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 14:37:40.895512 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.895479 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 14:37:40.895591 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.895578 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-24bld\""
Apr 16 14:37:40.898963 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.898942 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-97xc6"]
Apr 16 14:37:40.994521 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.994481 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk26v\" (UniqueName: \"kubernetes.io/projected/c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74-kube-api-access-zk26v\") pod \"cert-manager-webhook-597b96b99b-97xc6\" (UID: \"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74\") " pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:40.994684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:40.994541 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-97xc6\" (UID: \"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74\") " pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:41.095526 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:41.095489 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-97xc6\" (UID: \"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74\") " pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:41.095694 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:41.095597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk26v\" (UniqueName: \"kubernetes.io/projected/c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74-kube-api-access-zk26v\") pod \"cert-manager-webhook-597b96b99b-97xc6\" (UID: \"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74\") " pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:41.104360 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:41.104326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-97xc6\" (UID: \"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74\") " pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:41.104647 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:41.104628 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk26v\" (UniqueName: \"kubernetes.io/projected/c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74-kube-api-access-zk26v\") pod \"cert-manager-webhook-597b96b99b-97xc6\" (UID: \"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74\") " pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:41.208766 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:41.208682 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:41.333352 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:41.333285 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-97xc6"]
Apr 16 14:37:41.335991 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:37:41.335961 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc38b1d9c_7fee_45ff_b93c_ca5ca6a32f74.slice/crio-c4e94e91a7d7ac8657c3a2b7df85eaf40ee478829c262fed520f9e768a8450ee WatchSource:0}: Error finding container c4e94e91a7d7ac8657c3a2b7df85eaf40ee478829c262fed520f9e768a8450ee: Status 404 returned error can't find the container with id c4e94e91a7d7ac8657c3a2b7df85eaf40ee478829c262fed520f9e768a8450ee
Apr 16 14:37:41.565551 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:41.565515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6" event={"ID":"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74","Type":"ContainerStarted","Data":"c4e94e91a7d7ac8657c3a2b7df85eaf40ee478829c262fed520f9e768a8450ee"}
Apr 16 14:37:45.582187 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:45.582150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6" event={"ID":"c38b1d9c-7fee-45ff-b93c-ca5ca6a32f74","Type":"ContainerStarted","Data":"335305c91ad32075429ad71bf538f24786b5bd9f3497bf9ddb0c0483f802dae5"}
Apr 16 14:37:45.582576 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:45.582209 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:37:45.598435 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:45.598360 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6" podStartSLOduration=2.101885521 podStartE2EDuration="5.598342321s" podCreationTimestamp="2026-04-16 14:37:40 +0000 UTC" firstStartedPulling="2026-04-16 14:37:41.339382115 +0000 UTC m=+467.957171873" lastFinishedPulling="2026-04-16 14:37:44.835838928 +0000 UTC m=+471.453628673" observedRunningTime="2026-04-16 14:37:45.59764526 +0000 UTC m=+472.215435037" watchObservedRunningTime="2026-04-16 14:37:45.598342321 +0000 UTC m=+472.216132083"
Apr 16 14:37:51.587337 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:37:51.587295 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-97xc6"
Apr 16 14:38:10.667143 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.667109 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"]
Apr 16 14:38:10.669378 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.669362 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.671516 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.671491 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 14:38:10.671712 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.671686 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-rhxrn\""
Apr 16 14:38:10.671927 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.671712 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 14:38:10.671927 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.671784 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 14:38:10.672101 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.671796 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 14:38:10.683480 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.683458 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"]
Apr 16 14:38:10.858793 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.858760 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.858995 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.858802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.858995 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.858902 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2d4n\" (UniqueName: \"kubernetes.io/projected/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-kube-api-access-p2d4n\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.960174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.960086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2d4n\" (UniqueName: \"kubernetes.io/projected/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-kube-api-access-p2d4n\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.960174 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.960167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.960400 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.960196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.962669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.962642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.962785 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.962665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.969643 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.969621 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2d4n\" (UniqueName: \"kubernetes.io/projected/396e2470-cef3-48c8-adf2-6ac7cb57b2e3-kube-api-access-p2d4n\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2\" (UID: \"396e2470-cef3-48c8-adf2-6ac7cb57b2e3\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:10.980908 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:10.980889 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:11.109204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:11.109171 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"]
Apr 16 14:38:11.113312 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:38:11.113284 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod396e2470_cef3_48c8_adf2_6ac7cb57b2e3.slice/crio-d786841b658cbca061db4ed2166ba65a03d9e91373209265c367fbfa56011558 WatchSource:0}: Error finding container d786841b658cbca061db4ed2166ba65a03d9e91373209265c367fbfa56011558: Status 404 returned error can't find the container with id d786841b658cbca061db4ed2166ba65a03d9e91373209265c367fbfa56011558
Apr 16 14:38:11.661894 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:11.661854 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2" event={"ID":"396e2470-cef3-48c8-adf2-6ac7cb57b2e3","Type":"ContainerStarted","Data":"d786841b658cbca061db4ed2166ba65a03d9e91373209265c367fbfa56011558"}
Apr 16 14:38:13.671779 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:13.671740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2" event={"ID":"396e2470-cef3-48c8-adf2-6ac7cb57b2e3","Type":"ContainerStarted","Data":"1879f6787f8d5d53872e5e1f622d3e0e7a8008d1b131e579094f9a31386d1ff5"}
Apr 16 14:38:13.672182 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:13.671870 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:13.696192 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:13.696117 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2" podStartSLOduration=1.264698408 podStartE2EDuration="3.696097832s" podCreationTimestamp="2026-04-16 14:38:10 +0000 UTC" firstStartedPulling="2026-04-16 14:38:11.114846341 +0000 UTC m=+497.732636082" lastFinishedPulling="2026-04-16 14:38:13.546245765 +0000 UTC m=+500.164035506" observedRunningTime="2026-04-16 14:38:13.696054472 +0000 UTC m=+500.313844236" watchObservedRunningTime="2026-04-16 14:38:13.696097832 +0000 UTC m=+500.313887595"
Apr 16 14:38:24.677327 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:24.677290 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2"
Apr 16 14:38:32.720006 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.719928 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"]
Apr 16 14:38:32.722449 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.722433 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.725145 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.725111 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:38:32.726063 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.725926 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 14:38:32.726063 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.725943 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 14:38:32.726063 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.725931 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 14:38:32.726063 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.726008 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kd2cq\""
Apr 16 14:38:32.726363 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.726201 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 14:38:32.742126 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.742094 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"]
Apr 16 14:38:32.826603 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.826563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-metrics-cert\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.826603 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.826602 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-manager-config\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.826837 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.826648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzjw5\" (UniqueName: \"kubernetes.io/projected/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-kube-api-access-bzjw5\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.826837 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.826743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-cert\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.927988 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.927949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzjw5\" (UniqueName: \"kubernetes.io/projected/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-kube-api-access-bzjw5\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.928216 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.928010 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-cert\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.928216 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.928085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-metrics-cert\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.928216 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.928113 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-manager-config\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.928892 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.928866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-manager-config\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.931236 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.931212 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-metrics-cert\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.931359 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.931296 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-cert\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:32.940749 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:32.940717 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzjw5\" (UniqueName: \"kubernetes.io/projected/71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c-kube-api-access-bzjw5\") pod \"lws-controller-manager-76cd85c697-7cbd8\" (UID: \"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:33.042958 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:33.042923 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:33.209121 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:33.209089 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"]
Apr 16 14:38:33.212322 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:38:33.212286 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71dd7fb7_6b6c_4fd2_8573_d5aeefd1521c.slice/crio-82d6c2229f11a5188e3d8daeea13a8f88af489b2a45d7525ce3aca3cc2f44c62 WatchSource:0}: Error finding container 82d6c2229f11a5188e3d8daeea13a8f88af489b2a45d7525ce3aca3cc2f44c62: Status 404 returned error can't find the container with id 82d6c2229f11a5188e3d8daeea13a8f88af489b2a45d7525ce3aca3cc2f44c62
Apr 16 14:38:33.740189 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:33.740151 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8" event={"ID":"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c","Type":"ContainerStarted","Data":"82d6c2229f11a5188e3d8daeea13a8f88af489b2a45d7525ce3aca3cc2f44c62"}
Apr 16 14:38:35.748437 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:35.748400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8" event={"ID":"71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c","Type":"ContainerStarted","Data":"08661de780f9e259dffa19026403769b9a06bcd4dd0d4a819ea69f48688866c7"}
Apr 16 14:38:35.748821 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:35.748514 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:38:35.765411 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:35.765357 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8" podStartSLOduration=1.32767999 podStartE2EDuration="3.765340444s" podCreationTimestamp="2026-04-16 14:38:32 +0000 UTC" firstStartedPulling="2026-04-16 14:38:33.214052231 +0000 UTC m=+519.831841973" lastFinishedPulling="2026-04-16 14:38:35.65171267 +0000 UTC m=+522.269502427" observedRunningTime="2026-04-16 14:38:35.764647648 +0000 UTC m=+522.382437411" watchObservedRunningTime="2026-04-16 14:38:35.765340444 +0000 UTC m=+522.383130210"
Apr 16 14:38:46.754645 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:38:46.754614 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-7cbd8"
Apr 16 14:39:01.040177 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.040138 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"]
Apr 16 14:39:01.046986 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.046960 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.049480 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.049444 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 14:39:01.049628 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.049502 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 14:39:01.050068 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.049508 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 14:39:01.050204 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.050189 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-8nj5v\""
Apr 16 14:39:01.056243 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.056211 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"]
Apr 16 14:39:01.180642 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180604 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.180642 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180647 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.180859 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.180859 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.180926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.180926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180890 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.180926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-kube-api-access-28qdz\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.181021 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.181021 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.180963 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282412 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282618 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282435 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282618 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282467 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282618 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282490 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-kube-api-access-28qdz\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282618 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282854 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282854 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.282854 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.283009 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.282848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.283178 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.283153 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.283345 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.283326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.283450 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.283363 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:01.283450 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.283405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-credential-socket\") pod 
\"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" Apr 16 14:39:01.283531 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.283516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" Apr 16 14:39:01.285423 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.285392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" Apr 16 14:39:01.285518 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.285501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" Apr 16 14:39:01.291579 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.291517 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9"] Apr 16 14:39:01.294157 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.294139 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.301066 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.301019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" Apr 16 14:39:01.301347 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.301324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-kube-api-access-28qdz\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" Apr 16 14:39:01.305788 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.305767 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9"] Apr 16 14:39:01.366911 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.366875 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" Apr 16 14:39:01.383999 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.383972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384134 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384134 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384055 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384134 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384079 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6gmb\" (UniqueName: \"kubernetes.io/projected/0400e2bb-d2d7-4633-bdc2-77847ef73977-kube-api-access-s6gmb\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384134 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384286 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0400e2bb-d2d7-4633-bdc2-77847ef73977-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384286 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384363 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.384363 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.384338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.484910 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.484875 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.484910 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.484912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485211 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.484937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485211 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.484965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485211 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485211 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6gmb\" (UniqueName: \"kubernetes.io/projected/0400e2bb-d2d7-4633-bdc2-77847ef73977-kube-api-access-s6gmb\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485211 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485216 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0400e2bb-d2d7-4633-bdc2-77847ef73977-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485396 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485682 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485682 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.485987 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.485964 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0400e2bb-d2d7-4633-bdc2-77847ef73977-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.487423 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.487404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-envoy\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.487810 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.487791 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.500717 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.500693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0400e2bb-d2d7-4633-bdc2-77847ef73977-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.507782 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.507762 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6gmb\" (UniqueName: \"kubernetes.io/projected/0400e2bb-d2d7-4633-bdc2-77847ef73977-kube-api-access-s6gmb\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9\" (UID: \"0400e2bb-d2d7-4633-bdc2-77847ef73977\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.526178 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.526151 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"] Apr 16 14:39:01.529222 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:39:01.529192 2579 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f5bfe1_28fc_4d71_9ab9_1dbff5b5850e.slice/crio-d81947ffa98a8f9dfb8f444cf99cf3cf6c1c3f72bea038350a0998ee9dd85771 WatchSource:0}: Error finding container d81947ffa98a8f9dfb8f444cf99cf3cf6c1c3f72bea038350a0998ee9dd85771: Status 404 returned error can't find the container with id d81947ffa98a8f9dfb8f444cf99cf3cf6c1c3f72bea038350a0998ee9dd85771 Apr 16 14:39:01.628962 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.628864 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" Apr 16 14:39:01.767908 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.767872 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9"] Apr 16 14:39:01.772329 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:39:01.772302 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0400e2bb_d2d7_4633_bdc2_77847ef73977.slice/crio-860eff9c9ce8982455c224272feddbb7e3bebd046c095472a15a9150c3881580 WatchSource:0}: Error finding container 860eff9c9ce8982455c224272feddbb7e3bebd046c095472a15a9150c3881580: Status 404 returned error can't find the container with id 860eff9c9ce8982455c224272feddbb7e3bebd046c095472a15a9150c3881580 Apr 16 14:39:01.841526 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.841486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" event={"ID":"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e","Type":"ContainerStarted","Data":"d81947ffa98a8f9dfb8f444cf99cf3cf6c1c3f72bea038350a0998ee9dd85771"} Apr 16 14:39:01.842550 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:01.842525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" event={"ID":"0400e2bb-d2d7-4633-bdc2-77847ef73977","Type":"ContainerStarted","Data":"860eff9c9ce8982455c224272feddbb7e3bebd046c095472a15a9150c3881580"} Apr 16 14:39:03.887473 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:03.887434 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:39:03.887855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:03.887525 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:39:03.887855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:03.887608 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:39:03.893144 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:03.893118 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:39:03.893247 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:03.893186 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:39:03.893247 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:03.893225 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:39:04.066763 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.066227 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-7964f57bdf-jqk5z"] Apr 16 14:39:04.069414 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.069391 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7964f57bdf-jqk5z" Apr 16 14:39:04.074893 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.073879 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:39:04.074893 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.074169 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:39:04.074893 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.074422 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:39:04.074893 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.074609 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:39:04.076109 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.075755 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:39:04.076109 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.075970 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-5hq8m\"" Apr 16 14:39:04.080741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.080718 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:39:04.086646 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.086559 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7964f57bdf-jqk5z"] Apr 16 14:39:04.214431 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.214348 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08fa673f-343b-4303-89a6-7d03095cc905-console-serving-cert\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z" Apr 16 14:39:04.214431 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.214415 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-console-config\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z" Apr 16 14:39:04.214651 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.214522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-trusted-ca-bundle\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z" Apr 16 14:39:04.214651 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.214556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-oauth-serving-cert\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z" Apr 16 14:39:04.214651 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.214596 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08fa673f-343b-4303-89a6-7d03095cc905-console-oauth-config\") pod \"console-7964f57bdf-jqk5z\" (UID: 
\"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.214797 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.214645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r8x2\" (UniqueName: \"kubernetes.io/projected/08fa673f-343b-4303-89a6-7d03095cc905-kube-api-access-9r8x2\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.214797 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.214709 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-service-ca\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.315581 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.315532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-trusted-ca-bundle\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.315581 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.315579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-oauth-serving-cert\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.315851 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.315717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08fa673f-343b-4303-89a6-7d03095cc905-console-oauth-config\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.315851 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.315775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r8x2\" (UniqueName: \"kubernetes.io/projected/08fa673f-343b-4303-89a6-7d03095cc905-kube-api-access-9r8x2\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.315851 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.315825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-service-ca\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.315991 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.315887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08fa673f-343b-4303-89a6-7d03095cc905-console-serving-cert\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.315991 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.315966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-console-config\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.316499 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.316472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-oauth-serving-cert\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.316763 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.316740 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-console-config\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.316862 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.316771 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-service-ca\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.316925 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.316898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08fa673f-343b-4303-89a6-7d03095cc905-trusted-ca-bundle\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.318755 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.318735 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/08fa673f-343b-4303-89a6-7d03095cc905-console-serving-cert\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.318859 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.318838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/08fa673f-343b-4303-89a6-7d03095cc905-console-oauth-config\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.325302 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.325278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r8x2\" (UniqueName: \"kubernetes.io/projected/08fa673f-343b-4303-89a6-7d03095cc905-kube-api-access-9r8x2\") pod \"console-7964f57bdf-jqk5z\" (UID: \"08fa673f-343b-4303-89a6-7d03095cc905\") " pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.389257 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.389218 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:04.516320 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.516287 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7964f57bdf-jqk5z"]
Apr 16 14:39:04.520283 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:39:04.520246 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08fa673f_343b_4303_89a6_7d03095cc905.slice/crio-b4413404aa75c8d2ad9719ad60841de3c0924e2f0d220a1cf003df0146ac6ca3 WatchSource:0}: Error finding container b4413404aa75c8d2ad9719ad60841de3c0924e2f0d220a1cf003df0146ac6ca3: Status 404 returned error can't find the container with id b4413404aa75c8d2ad9719ad60841de3c0924e2f0d220a1cf003df0146ac6ca3
Apr 16 14:39:04.854395 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.854358 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7964f57bdf-jqk5z" event={"ID":"08fa673f-343b-4303-89a6-7d03095cc905","Type":"ContainerStarted","Data":"bc2ae186d3adcc81698a34eed0dc45d04cb15c5f9272ac6e951aaade72348837"}
Apr 16 14:39:04.854395 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.854400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7964f57bdf-jqk5z" event={"ID":"08fa673f-343b-4303-89a6-7d03095cc905","Type":"ContainerStarted","Data":"b4413404aa75c8d2ad9719ad60841de3c0924e2f0d220a1cf003df0146ac6ca3"}
Apr 16 14:39:04.855790 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.855761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" event={"ID":"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e","Type":"ContainerStarted","Data":"2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b"}
Apr 16 14:39:04.856994 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.856974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" event={"ID":"0400e2bb-d2d7-4633-bdc2-77847ef73977","Type":"ContainerStarted","Data":"2e1706f3007fcad8cdea50bf2ec2fc0501179817718ebf67ae0449fbb38438d1"}
Apr 16 14:39:04.875575 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.875515 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7964f57bdf-jqk5z" podStartSLOduration=0.875499107 podStartE2EDuration="875.499107ms" podCreationTimestamp="2026-04-16 14:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:39:04.873875506 +0000 UTC m=+551.491665269" watchObservedRunningTime="2026-04-16 14:39:04.875499107 +0000 UTC m=+551.493288871"
Apr 16 14:39:04.894288 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.894248 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" podStartSLOduration=1.5383301660000002 podStartE2EDuration="3.894237088s" podCreationTimestamp="2026-04-16 14:39:01 +0000 UTC" firstStartedPulling="2026-04-16 14:39:01.531219007 +0000 UTC m=+548.149008748" lastFinishedPulling="2026-04-16 14:39:03.887125926 +0000 UTC m=+550.504915670" observedRunningTime="2026-04-16 14:39:04.892507362 +0000 UTC m=+551.510297127" watchObservedRunningTime="2026-04-16 14:39:04.894237088 +0000 UTC m=+551.512026852"
Apr 16 14:39:04.911769 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:04.911723 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9" podStartSLOduration=1.7935219770000002 podStartE2EDuration="3.911710895s" podCreationTimestamp="2026-04-16 14:39:01 +0000 UTC" firstStartedPulling="2026-04-16 14:39:01.774710619 +0000 UTC m=+548.392500363" lastFinishedPulling="2026-04-16 14:39:03.892899528 +0000 UTC m=+550.510689281" observedRunningTime="2026-04-16 14:39:04.909552796 +0000 UTC m=+551.527342560" watchObservedRunningTime="2026-04-16 14:39:04.911710895 +0000 UTC m=+551.529500656"
Apr 16 14:39:05.367011 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.366969 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:05.368626 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.368600 2579 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.28:15021/healthz/ready\": dial tcp 10.134.0.28:15021: connect: connection refused" start-of-body=
Apr 16 14:39:05.368737 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.368649 2579 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.28:15021/healthz/ready\": dial tcp 10.134.0.28:15021: connect: connection refused"
Apr 16 14:39:05.629615 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.629521 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9"
Apr 16 14:39:05.634749 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.634723 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9"
Apr 16 14:39:05.861285 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.861245 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9"
Apr 16 14:39:05.862423 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.862405 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9"
Apr 16 14:39:05.919473 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:05.919382 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"]
Apr 16 14:39:06.367688 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:06.367651 2579 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.28:15021/healthz/ready\": dial tcp 10.134.0.28:15021: connect: connection refused" start-of-body=
Apr 16 14:39:06.367871 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:06.367713 2579 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.28:15021/healthz/ready\": dial tcp 10.134.0.28:15021: connect: connection refused"
Apr 16 14:39:07.367455 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:07.367419 2579 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.28:15021/healthz/ready\": dial tcp 10.134.0.28:15021: connect: connection refused" start-of-body=
Apr 16 14:39:07.367855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:07.367476 2579 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.28:15021/healthz/ready\": dial tcp 10.134.0.28:15021: connect: connection refused"
Apr 16 14:39:07.872152 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:07.872080 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerName="istio-proxy" containerID="cri-o://2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b" gracePeriod=30
Apr 16 14:39:13.124654 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.124629 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:13.301242 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301203 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istiod-ca-cert\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301425 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301292 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-credential-socket\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301425 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301337 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-data\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301425 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301360 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-certs\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301425 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301416 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-podinfo\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301687 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301442 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-kube-api-access-28qdz\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301687 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301482 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-socket\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301687 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301508 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-envoy\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301687 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301541 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-token\") pod \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\" (UID: \"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e\") "
Apr 16 14:39:13.301687 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301661 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:39:13.301906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301672 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-data" (OuterVolumeSpecName: "istio-data") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:39:13.301906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301672 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:39:13.301906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301742 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:39:13.301906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301768 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:39:13.301906 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301892 2579 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-credential-socket\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.302108 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301913 2579 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-data\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.302108 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301928 2579 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-certs\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.302108 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301942 2579 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-workload-socket\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.302108 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.301955 2579 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istiod-ca-cert\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.303763 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.303729 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue ""
Apr 16 14:39:13.303865 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.303803 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-kube-api-access-28qdz" (OuterVolumeSpecName: "kube-api-access-28qdz") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "kube-api-access-28qdz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:39:13.304079 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.304057 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:39:13.304079 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.304069 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-token" (OuterVolumeSpecName: "istio-token") pod "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" (UID: "63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:39:13.403057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.403003 2579 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-envoy\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.403057 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.403054 2579 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-token\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.403264 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.403068 2579 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-istio-podinfo\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.403264 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.403082 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28qdz\" (UniqueName: \"kubernetes.io/projected/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e-kube-api-access-28qdz\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:39:13.893081 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.893043 2579 generic.go:358] "Generic (PLEG): container finished" podID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerID="2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b" exitCode=0
Apr 16 14:39:13.893287 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.893138 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"
Apr 16 14:39:13.893287 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.893137 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" event={"ID":"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e","Type":"ContainerDied","Data":"2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b"}
Apr 16 14:39:13.893287 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.893184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm" event={"ID":"63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e","Type":"ContainerDied","Data":"d81947ffa98a8f9dfb8f444cf99cf3cf6c1c3f72bea038350a0998ee9dd85771"}
Apr 16 14:39:13.893287 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.893206 2579 scope.go:117] "RemoveContainer" containerID="2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b"
Apr 16 14:39:13.902169 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.902153 2579 scope.go:117] "RemoveContainer" containerID="2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b"
Apr 16 14:39:13.902461 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:39:13.902443 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b\": container with ID starting with 2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b not found: ID does not exist" containerID="2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b"
Apr 16 14:39:13.902549 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.902472 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b"} err="failed to get container status \"2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b\": rpc error: code = NotFound desc = could not find container \"2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b\": container with ID starting with 2a23cc0efa8bdc8ce1ea5a9fc75a45f74bef35eef93820de603678fdac3d924b not found: ID does not exist"
Apr 16 14:39:13.915683 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.915657 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"]
Apr 16 14:39:13.920460 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.920430 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5rrlrm"]
Apr 16 14:39:13.978328 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:13.978296 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" path="/var/lib/kubelet/pods/63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e/volumes"
Apr 16 14:39:14.389846 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:14.389810 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:14.390274 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:14.389860 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:14.394657 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:14.394635 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:14.901326 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:14.901301 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7964f57bdf-jqk5z"
Apr 16 14:39:29.160662 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.160624 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x5mgm"]
Apr 16 14:39:29.161086 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.160971 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerName="istio-proxy"
Apr 16 14:39:29.161086 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.160981 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerName="istio-proxy"
Apr 16 14:39:29.161086 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.161068 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="63f5bfe1-28fc-4d71-9ab9-1dbff5b5850e" containerName="istio-proxy"
Apr 16 14:39:29.165539 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.165514 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm"
Apr 16 14:39:29.167715 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.167688 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 14:39:29.168399 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.168376 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 14:39:29.168517 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.168430 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-pv6r2\""
Apr 16 14:39:29.174930 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.174907 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x5mgm"]
Apr 16 14:39:29.229286 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.229256 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nrjx\" (UniqueName: \"kubernetes.io/projected/93af145b-74f9-4ac9-aea5-80b05fb04636-kube-api-access-8nrjx\") pod \"kuadrant-operator-catalog-x5mgm\" (UID: \"93af145b-74f9-4ac9-aea5-80b05fb04636\") " pod="kuadrant-system/kuadrant-operator-catalog-x5mgm"
Apr 16 14:39:29.329979 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.329945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nrjx\" (UniqueName: \"kubernetes.io/projected/93af145b-74f9-4ac9-aea5-80b05fb04636-kube-api-access-8nrjx\") pod \"kuadrant-operator-catalog-x5mgm\" (UID: \"93af145b-74f9-4ac9-aea5-80b05fb04636\") " pod="kuadrant-system/kuadrant-operator-catalog-x5mgm"
Apr 16 14:39:29.338503 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.338477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nrjx\" (UniqueName: \"kubernetes.io/projected/93af145b-74f9-4ac9-aea5-80b05fb04636-kube-api-access-8nrjx\") pod \"kuadrant-operator-catalog-x5mgm\" (UID: \"93af145b-74f9-4ac9-aea5-80b05fb04636\") " pod="kuadrant-system/kuadrant-operator-catalog-x5mgm"
Apr 16 14:39:29.476416 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.476316 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm"
Apr 16 14:39:29.515972 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.515939 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x5mgm"]
Apr 16 14:39:29.601477 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.601443 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x5mgm"]
Apr 16 14:39:29.604646 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:39:29.604617 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93af145b_74f9_4ac9_aea5_80b05fb04636.slice/crio-aef29c18f4eb292bd8b96e36d60e74a1febe6881059d77e81e6f4ac861724419 WatchSource:0}: Error finding container aef29c18f4eb292bd8b96e36d60e74a1febe6881059d77e81e6f4ac861724419: Status 404 returned error can't find the container with id aef29c18f4eb292bd8b96e36d60e74a1febe6881059d77e81e6f4ac861724419
Apr 16 14:39:29.725135 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.725102 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mklm2"]
Apr 16 14:39:29.729809 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.729760 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mklm2"
Apr 16 14:39:29.735447 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.735418 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mklm2"]
Apr 16 14:39:29.833779 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.833741 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfxrd\" (UniqueName: \"kubernetes.io/projected/b9899edc-232d-4353-bac7-ff31a12e96b6-kube-api-access-xfxrd\") pod \"kuadrant-operator-catalog-mklm2\" (UID: \"b9899edc-232d-4353-bac7-ff31a12e96b6\") " pod="kuadrant-system/kuadrant-operator-catalog-mklm2"
Apr 16 14:39:29.935080 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.935046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfxrd\" (UniqueName: \"kubernetes.io/projected/b9899edc-232d-4353-bac7-ff31a12e96b6-kube-api-access-xfxrd\") pod \"kuadrant-operator-catalog-mklm2\" (UID: \"b9899edc-232d-4353-bac7-ff31a12e96b6\") " pod="kuadrant-system/kuadrant-operator-catalog-mklm2"
Apr 16 14:39:29.943778 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.943747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfxrd\" (UniqueName: \"kubernetes.io/projected/b9899edc-232d-4353-bac7-ff31a12e96b6-kube-api-access-xfxrd\") pod \"kuadrant-operator-catalog-mklm2\" (UID: \"b9899edc-232d-4353-bac7-ff31a12e96b6\") " pod="kuadrant-system/kuadrant-operator-catalog-mklm2"
Apr 16 14:39:29.953669 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:29.953636 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" event={"ID":"93af145b-74f9-4ac9-aea5-80b05fb04636","Type":"ContainerStarted","Data":"aef29c18f4eb292bd8b96e36d60e74a1febe6881059d77e81e6f4ac861724419"}
Apr 16 14:39:30.040188 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:30.040149 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mklm2"
Apr 16 14:39:30.194684 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:30.194658 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mklm2"]
Apr 16 14:39:30.234992 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:39:30.234950 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9899edc_232d_4353_bac7_ff31a12e96b6.slice/crio-c2980214d6e571cd4c746bd8c3088449da76872329851e8997d0c5e171e611d8 WatchSource:0}: Error finding container c2980214d6e571cd4c746bd8c3088449da76872329851e8997d0c5e171e611d8: Status 404 returned error can't find the container with id c2980214d6e571cd4c746bd8c3088449da76872329851e8997d0c5e171e611d8
Apr 16 14:39:30.959682 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:30.959591 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mklm2" event={"ID":"b9899edc-232d-4353-bac7-ff31a12e96b6","Type":"ContainerStarted","Data":"c2980214d6e571cd4c746bd8c3088449da76872329851e8997d0c5e171e611d8"}
Apr 16 14:39:31.964839 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:31.964804 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mklm2" event={"ID":"b9899edc-232d-4353-bac7-ff31a12e96b6","Type":"ContainerStarted","Data":"508062a9bf69a6354b70196990b1c5791790cee7929f6b8995d923ddc940bac5"}
Apr 16 14:39:31.966177 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:31.966118 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" event={"ID":"93af145b-74f9-4ac9-aea5-80b05fb04636","Type":"ContainerStarted","Data":"1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397"}
Apr 16 14:39:31.966293 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:31.966201 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" podUID="93af145b-74f9-4ac9-aea5-80b05fb04636" containerName="registry-server" containerID="cri-o://1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397" gracePeriod=2
Apr 16 14:39:31.982480 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:31.982421 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-mklm2" podStartSLOduration=1.492894591 podStartE2EDuration="2.982406532s" podCreationTimestamp="2026-04-16 14:39:29 +0000 UTC" firstStartedPulling="2026-04-16 14:39:30.236765016 +0000 UTC m=+576.854554770" lastFinishedPulling="2026-04-16 14:39:31.726276967 +0000 UTC m=+578.344066711" observedRunningTime="2026-04-16 14:39:31.980215881 +0000 UTC m=+578.598005644" watchObservedRunningTime="2026-04-16 14:39:31.982406532 +0000 UTC m=+578.600196295"
Apr 16 14:39:31.995288 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:31.995245 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" podStartSLOduration=0.877463076 podStartE2EDuration="2.995229982s" podCreationTimestamp="2026-04-16 14:39:29 +0000 UTC" firstStartedPulling="2026-04-16 14:39:29.605930993 +0000 UTC m=+576.223720734" lastFinishedPulling="2026-04-16 14:39:31.723697896 +0000 UTC m=+578.341487640" observedRunningTime="2026-04-16 14:39:31.994276582 +0000 UTC m=+578.612066346" watchObservedRunningTime="2026-04-16 14:39:31.995229982 +0000 UTC m=+578.613019744"
Apr 16 14:39:32.209823 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.209800 2579 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" Apr 16 14:39:32.257831 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.257750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nrjx\" (UniqueName: \"kubernetes.io/projected/93af145b-74f9-4ac9-aea5-80b05fb04636-kube-api-access-8nrjx\") pod \"93af145b-74f9-4ac9-aea5-80b05fb04636\" (UID: \"93af145b-74f9-4ac9-aea5-80b05fb04636\") " Apr 16 14:39:32.259946 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.259918 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93af145b-74f9-4ac9-aea5-80b05fb04636-kube-api-access-8nrjx" (OuterVolumeSpecName: "kube-api-access-8nrjx") pod "93af145b-74f9-4ac9-aea5-80b05fb04636" (UID: "93af145b-74f9-4ac9-aea5-80b05fb04636"). InnerVolumeSpecName "kube-api-access-8nrjx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:39:32.358516 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.358485 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nrjx\" (UniqueName: \"kubernetes.io/projected/93af145b-74f9-4ac9-aea5-80b05fb04636-kube-api-access-8nrjx\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:39:32.970805 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.970767 2579 generic.go:358] "Generic (PLEG): container finished" podID="93af145b-74f9-4ac9-aea5-80b05fb04636" containerID="1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397" exitCode=0 Apr 16 14:39:32.971320 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.970829 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" Apr 16 14:39:32.971320 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.970851 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" event={"ID":"93af145b-74f9-4ac9-aea5-80b05fb04636","Type":"ContainerDied","Data":"1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397"} Apr 16 14:39:32.971320 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.970888 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x5mgm" event={"ID":"93af145b-74f9-4ac9-aea5-80b05fb04636","Type":"ContainerDied","Data":"aef29c18f4eb292bd8b96e36d60e74a1febe6881059d77e81e6f4ac861724419"} Apr 16 14:39:32.971320 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.970903 2579 scope.go:117] "RemoveContainer" containerID="1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397" Apr 16 14:39:32.979618 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.979601 2579 scope.go:117] "RemoveContainer" containerID="1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397" Apr 16 14:39:32.979848 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:39:32.979833 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397\": container with ID starting with 1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397 not found: ID does not exist" containerID="1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397" Apr 16 14:39:32.979889 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.979858 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397"} err="failed to get container status \"1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397\": rpc 
error: code = NotFound desc = could not find container \"1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397\": container with ID starting with 1cf3d2f7d8429b47fe3cd34228ebbbda246fb141796486f6a91f2354f027b397 not found: ID does not exist" Apr 16 14:39:32.991466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.991437 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x5mgm"] Apr 16 14:39:32.997649 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:32.997593 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x5mgm"] Apr 16 14:39:33.983179 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:33.979830 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93af145b-74f9-4ac9-aea5-80b05fb04636" path="/var/lib/kubelet/pods/93af145b-74f9-4ac9-aea5-80b05fb04636/volumes" Apr 16 14:39:40.041124 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:40.041087 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-mklm2" Apr 16 14:39:40.041124 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:40.041136 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-mklm2" Apr 16 14:39:40.062827 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:40.062798 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-mklm2" Apr 16 14:39:41.019939 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:41.019906 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-mklm2" Apr 16 14:39:53.882097 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:53.882069 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:39:53.882633 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:53.882256 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:39:53.884334 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:53.884315 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:39:53.884514 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:39:53.884498 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:40:02.495447 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.495364 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf"] Apr 16 14:40:02.501510 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.495911 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93af145b-74f9-4ac9-aea5-80b05fb04636" containerName="registry-server" Apr 16 14:40:02.501510 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.495926 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="93af145b-74f9-4ac9-aea5-80b05fb04636" containerName="registry-server" Apr 16 14:40:02.501510 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.495980 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="93af145b-74f9-4ac9-aea5-80b05fb04636" containerName="registry-server" Apr 16 14:40:02.501944 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.501928 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.504170 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.504150 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7cjrp\"" Apr 16 14:40:02.509744 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.509725 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf"] Apr 16 14:40:02.539824 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.539794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzvlr\" (UniqueName: \"kubernetes.io/projected/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-kube-api-access-lzvlr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.539936 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.539841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.641088 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.641056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzvlr\" (UniqueName: \"kubernetes.io/projected/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-kube-api-access-lzvlr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.641227 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.641107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.641480 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.641464 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.650153 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.650124 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzvlr\" (UniqueName: \"kubernetes.io/projected/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-kube-api-access-lzvlr\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.812603 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.812563 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:02.945347 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:02.945316 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf"] Apr 16 14:40:02.948780 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:40:02.948755 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ee3a35_d9e7_4d37_a7d4_3fa982e0e9da.slice/crio-6258a05b13c884134ccfb2e15555c8273ce0d8f6f0ee231eb23ec53fe78fa578 WatchSource:0}: Error finding container 6258a05b13c884134ccfb2e15555c8273ce0d8f6f0ee231eb23ec53fe78fa578: Status 404 returned error can't find the container with id 6258a05b13c884134ccfb2e15555c8273ce0d8f6f0ee231eb23ec53fe78fa578 Apr 16 14:40:03.070017 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:03.069937 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" event={"ID":"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da","Type":"ContainerStarted","Data":"6258a05b13c884134ccfb2e15555c8273ce0d8f6f0ee231eb23ec53fe78fa578"} Apr 16 14:40:04.488817 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.488777 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm"] Apr 16 14:40:04.492229 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.492207 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" Apr 16 14:40:04.495098 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.495079 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-stzgd\"" Apr 16 14:40:04.500935 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.500913 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm"] Apr 16 14:40:04.558927 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.558898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shhc\" (UniqueName: \"kubernetes.io/projected/728e57e4-4431-4b46-bdeb-a11f73a35e10-kube-api-access-7shhc\") pod \"limitador-operator-controller-manager-85c4996f8c-zqtcm\" (UID: \"728e57e4-4431-4b46-bdeb-a11f73a35e10\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" Apr 16 14:40:04.660442 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.660401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shhc\" (UniqueName: \"kubernetes.io/projected/728e57e4-4431-4b46-bdeb-a11f73a35e10-kube-api-access-7shhc\") pod \"limitador-operator-controller-manager-85c4996f8c-zqtcm\" (UID: \"728e57e4-4431-4b46-bdeb-a11f73a35e10\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" Apr 16 14:40:04.669112 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.669088 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shhc\" (UniqueName: \"kubernetes.io/projected/728e57e4-4431-4b46-bdeb-a11f73a35e10-kube-api-access-7shhc\") pod \"limitador-operator-controller-manager-85c4996f8c-zqtcm\" (UID: \"728e57e4-4431-4b46-bdeb-a11f73a35e10\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" Apr 16 14:40:04.804311 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.804267 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" Apr 16 14:40:04.933694 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:04.933672 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm"] Apr 16 14:40:04.935748 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:40:04.935718 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728e57e4_4431_4b46_bdeb_a11f73a35e10.slice/crio-8675fbea3526f34b233d3a368fd9fbc5249c2f91f659442428d1808013aa0b1b WatchSource:0}: Error finding container 8675fbea3526f34b233d3a368fd9fbc5249c2f91f659442428d1808013aa0b1b: Status 404 returned error can't find the container with id 8675fbea3526f34b233d3a368fd9fbc5249c2f91f659442428d1808013aa0b1b Apr 16 14:40:05.078694 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:05.078602 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" event={"ID":"728e57e4-4431-4b46-bdeb-a11f73a35e10","Type":"ContainerStarted","Data":"8675fbea3526f34b233d3a368fd9fbc5249c2f91f659442428d1808013aa0b1b"} Apr 16 14:40:07.096420 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:07.096381 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" event={"ID":"728e57e4-4431-4b46-bdeb-a11f73a35e10","Type":"ContainerStarted","Data":"de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7"} Apr 16 14:40:07.096895 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:07.096476 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" Apr 16 14:40:07.143122 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:07.143066 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" podStartSLOduration=1.520340207 podStartE2EDuration="3.143025604s" podCreationTimestamp="2026-04-16 14:40:04 +0000 UTC" firstStartedPulling="2026-04-16 14:40:04.938899603 +0000 UTC m=+611.556689353" lastFinishedPulling="2026-04-16 14:40:06.561585009 +0000 UTC m=+613.179374750" observedRunningTime="2026-04-16 14:40:07.14079493 +0000 UTC m=+613.758584730" watchObservedRunningTime="2026-04-16 14:40:07.143025604 +0000 UTC m=+613.760815367" Apr 16 14:40:09.108243 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:09.108208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" event={"ID":"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da","Type":"ContainerStarted","Data":"d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453"} Apr 16 14:40:09.108662 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:09.108278 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:09.130116 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:09.130065 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" podStartSLOduration=2.039834528 podStartE2EDuration="7.130048537s" podCreationTimestamp="2026-04-16 14:40:02 +0000 UTC" firstStartedPulling="2026-04-16 14:40:02.950966405 +0000 UTC m=+609.568756146" lastFinishedPulling="2026-04-16 14:40:08.041180415 +0000 UTC m=+614.658970155" observedRunningTime="2026-04-16 14:40:09.127106592 +0000 UTC m=+615.744896355" watchObservedRunningTime="2026-04-16 14:40:09.130048537 +0000 
UTC m=+615.747838299" Apr 16 14:40:18.104585 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:18.104546 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" Apr 16 14:40:20.114777 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:20.114745 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" Apr 16 14:40:21.083869 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.083835 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf"] Apr 16 14:40:21.084134 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.084108 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" containerName="manager" containerID="cri-o://d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453" gracePeriod=2 Apr 16 14:40:21.095148 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.095016 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf"] Apr 16 14:40:21.106815 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.106542 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"] Apr 16 14:40:21.107117 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.107098 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" containerName="manager" Apr 16 14:40:21.107117 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.107118 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" containerName="manager" Apr 16 14:40:21.107272 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.107217 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" containerName="manager" Apr 16 14:40:21.113101 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.113077 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm"] Apr 16 14:40:21.113219 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.113114 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm"] Apr 16 14:40:21.113312 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.113289 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" Apr 16 14:40:21.113558 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.113508 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" containerName="manager" containerID="cri-o://de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7" gracePeriod=2 Apr 16 14:40:21.115380 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.115346 2579 status_manager.go:895] "Failed to get status for pod" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 16 14:40:21.116942 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.116907 2579 status_manager.go:895] "Failed to get status for pod" 
podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" err="pods \"limitador-operator-controller-manager-85c4996f8c-zqtcm\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 16 14:40:21.121484 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.121444 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"] Apr 16 14:40:21.129353 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.129328 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"] Apr 16 14:40:21.129761 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.129748 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" containerName="manager" Apr 16 14:40:21.129820 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.129763 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" containerName="manager" Apr 16 14:40:21.129857 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.129822 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" containerName="manager" Apr 16 14:40:21.132907 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.132890 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9" Apr 16 14:40:21.145179 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.145154 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"] Apr 16 14:40:21.170059 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.170009 2579 status_manager.go:895] "Failed to get status for pod" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 16 14:40:21.171635 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.171607 2579 status_manager.go:895] "Failed to get status for pod" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" err="pods \"limitador-operator-controller-manager-85c4996f8c-zqtcm\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object" Apr 16 14:40:21.218107 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.218074 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzc5\" (UniqueName: \"kubernetes.io/projected/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-kube-api-access-2jzc5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9nw7g\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" Apr 16 14:40:21.218251 ip-10-0-140-144 
kubenswrapper[2579]: I0416 14:40:21.218156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8bzq\" (UniqueName: \"kubernetes.io/projected/c06bde55-395a-4576-8df5-d90d1fed8162-kube-api-access-m8bzq\") pod \"limitador-operator-controller-manager-85c4996f8c-gsvz9\" (UID: \"c06bde55-395a-4576-8df5-d90d1fed8162\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"
Apr 16 14:40:21.218251 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.218201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9nw7g\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:21.319773 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.319732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzc5\" (UniqueName: \"kubernetes.io/projected/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-kube-api-access-2jzc5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9nw7g\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:21.320502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.319798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8bzq\" (UniqueName: \"kubernetes.io/projected/c06bde55-395a-4576-8df5-d90d1fed8162-kube-api-access-m8bzq\") pod \"limitador-operator-controller-manager-85c4996f8c-gsvz9\" (UID: \"c06bde55-395a-4576-8df5-d90d1fed8162\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"
Apr 16 14:40:21.320502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.319825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9nw7g\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:21.320502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.320221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9nw7g\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:21.332820 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.332792 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8bzq\" (UniqueName: \"kubernetes.io/projected/c06bde55-395a-4576-8df5-d90d1fed8162-kube-api-access-m8bzq\") pod \"limitador-operator-controller-manager-85c4996f8c-gsvz9\" (UID: \"c06bde55-395a-4576-8df5-d90d1fed8162\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"
Apr 16 14:40:21.333226 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.333202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzc5\" (UniqueName: \"kubernetes.io/projected/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-kube-api-access-2jzc5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-9nw7g\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:21.358855 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.358835 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf"
Apr 16 14:40:21.360787 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.360757 2579 status_manager.go:895] "Failed to get status for pod" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object"
Apr 16 14:40:21.362249 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.362218 2579 status_manager.go:895] "Failed to get status for pod" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" err="pods \"limitador-operator-controller-manager-85c4996f8c-zqtcm\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object"
Apr 16 14:40:21.362335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.362292 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm"
Apr 16 14:40:21.363737 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.363714 2579 status_manager.go:895] "Failed to get status for pod" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-57jcf\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object"
Apr 16 14:40:21.365110 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.365085 2579 status_manager.go:895] "Failed to get status for pod" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm" err="pods \"limitador-operator-controller-manager-85c4996f8c-zqtcm\" is forbidden: User \"system:node:ip-10-0-140-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-144.ec2.internal' and this object"
Apr 16 14:40:21.421135 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.421103 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-extensions-socket-volume\") pod \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") "
Apr 16 14:40:21.421135 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.421136 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7shhc\" (UniqueName: \"kubernetes.io/projected/728e57e4-4431-4b46-bdeb-a11f73a35e10-kube-api-access-7shhc\") pod \"728e57e4-4431-4b46-bdeb-a11f73a35e10\" (UID: \"728e57e4-4431-4b46-bdeb-a11f73a35e10\") "
Apr 16 14:40:21.421282 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.421162 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzvlr\" (UniqueName: \"kubernetes.io/projected/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-kube-api-access-lzvlr\") pod \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\" (UID: \"37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da\") "
Apr 16 14:40:21.421581 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.421558 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" (UID: "37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:40:21.423322 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.423294 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728e57e4-4431-4b46-bdeb-a11f73a35e10-kube-api-access-7shhc" (OuterVolumeSpecName: "kube-api-access-7shhc") pod "728e57e4-4431-4b46-bdeb-a11f73a35e10" (UID: "728e57e4-4431-4b46-bdeb-a11f73a35e10"). InnerVolumeSpecName "kube-api-access-7shhc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:40:21.423409 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.423327 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-kube-api-access-lzvlr" (OuterVolumeSpecName: "kube-api-access-lzvlr") pod "37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" (UID: "37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da"). InnerVolumeSpecName "kube-api-access-lzvlr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:40:21.522147 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.522112 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-extensions-socket-volume\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:40:21.522147 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.522142 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7shhc\" (UniqueName: \"kubernetes.io/projected/728e57e4-4431-4b46-bdeb-a11f73a35e10-kube-api-access-7shhc\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:40:21.522147 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.522152 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzvlr\" (UniqueName: \"kubernetes.io/projected/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da-kube-api-access-lzvlr\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:40:21.536014 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.535982 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:21.544282 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.544256 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"
Apr 16 14:40:21.673229 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.673198 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"]
Apr 16 14:40:21.676514 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:40:21.676485 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c1ac03a_ba5a_4e62_9f70_cb64955730e1.slice/crio-91def9e485f5cca8aa16c276dc90eaf5926b8ef5d21627c668b606fc4ad55a05 WatchSource:0}: Error finding container 91def9e485f5cca8aa16c276dc90eaf5926b8ef5d21627c668b606fc4ad55a05: Status 404 returned error can't find the container with id 91def9e485f5cca8aa16c276dc90eaf5926b8ef5d21627c668b606fc4ad55a05
Apr 16 14:40:21.699110 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.699075 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"]
Apr 16 14:40:21.701564 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:40:21.701517 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc06bde55_395a_4576_8df5_d90d1fed8162.slice/crio-60dcd435a1ee573c1f8469d45be989b97e1d66ec85719802d79a93a8a2a9f9db WatchSource:0}: Error finding container 60dcd435a1ee573c1f8469d45be989b97e1d66ec85719802d79a93a8a2a9f9db: Status 404 returned error can't find the container with id 60dcd435a1ee573c1f8469d45be989b97e1d66ec85719802d79a93a8a2a9f9db
Apr 16 14:40:21.987051 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.986957 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" path="/var/lib/kubelet/pods/37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da/volumes"
Apr 16 14:40:21.987330 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:21.987317 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728e57e4-4431-4b46-bdeb-a11f73a35e10" path="/var/lib/kubelet/pods/728e57e4-4431-4b46-bdeb-a11f73a35e10/volumes"
Apr 16 14:40:22.155013 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.154967 2579 generic.go:358] "Generic (PLEG): container finished" podID="728e57e4-4431-4b46-bdeb-a11f73a35e10" containerID="de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7" exitCode=0
Apr 16 14:40:22.155467 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.155024 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zqtcm"
Apr 16 14:40:22.155467 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.155070 2579 scope.go:117] "RemoveContainer" containerID="de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7"
Apr 16 14:40:22.156687 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.156664 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" event={"ID":"4c1ac03a-ba5a-4e62-9f70-cb64955730e1","Type":"ContainerStarted","Data":"f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8"}
Apr 16 14:40:22.156782 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.156713 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" event={"ID":"4c1ac03a-ba5a-4e62-9f70-cb64955730e1","Type":"ContainerStarted","Data":"91def9e485f5cca8aa16c276dc90eaf5926b8ef5d21627c668b606fc4ad55a05"}
Apr 16 14:40:22.156851 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.156803 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:22.158302 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.158274 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9" event={"ID":"c06bde55-395a-4576-8df5-d90d1fed8162","Type":"ContainerStarted","Data":"623feaf497667dad026601e06e48ddca73bad31c47b98c54d88c952a87b07bb6"}
Apr 16 14:40:22.158302 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.158299 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9" event={"ID":"c06bde55-395a-4576-8df5-d90d1fed8162","Type":"ContainerStarted","Data":"60dcd435a1ee573c1f8469d45be989b97e1d66ec85719802d79a93a8a2a9f9db"}
Apr 16 14:40:22.158461 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.158395 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"
Apr 16 14:40:22.159502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.159484 2579 generic.go:358] "Generic (PLEG): container finished" podID="37ee3a35-d9e7-4d37-a7d4-3fa982e0e9da" containerID="d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453" exitCode=0
Apr 16 14:40:22.159603 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.159532 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-57jcf"
Apr 16 14:40:22.164149 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.164132 2579 scope.go:117] "RemoveContainer" containerID="de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7"
Apr 16 14:40:22.164376 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:40:22.164356 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7\": container with ID starting with de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7 not found: ID does not exist" containerID="de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7"
Apr 16 14:40:22.164445 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.164386 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7"} err="failed to get container status \"de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7\": rpc error: code = NotFound desc = could not find container \"de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7\": container with ID starting with de875e4f9f6aaf296e935617db0cbe0860313e745fbf0d78cb7915340b770ea7 not found: ID does not exist"
Apr 16 14:40:22.164445 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.164409 2579 scope.go:117] "RemoveContainer" containerID="d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453"
Apr 16 14:40:22.171648 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.171631 2579 scope.go:117] "RemoveContainer" containerID="d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453"
Apr 16 14:40:22.171904 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:40:22.171877 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453\": container with ID starting with d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453 not found: ID does not exist" containerID="d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453"
Apr 16 14:40:22.171981 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.171912 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453"} err="failed to get container status \"d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453\": rpc error: code = NotFound desc = could not find container \"d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453\": container with ID starting with d93ee61c3519107c8aeb22cc1f42213c6857a1942a06dcafa68e3192d3e17453 not found: ID does not exist"
Apr 16 14:40:22.181960 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.181919 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" podStartSLOduration=1.181907486 podStartE2EDuration="1.181907486s" podCreationTimestamp="2026-04-16 14:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:40:22.179527439 +0000 UTC m=+628.797317203" watchObservedRunningTime="2026-04-16 14:40:22.181907486 +0000 UTC m=+628.799697248"
Apr 16 14:40:22.197354 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:22.197312 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9" podStartSLOduration=1.197300705 podStartE2EDuration="1.197300705s" podCreationTimestamp="2026-04-16 14:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:40:22.195548535 +0000 UTC m=+628.813338285" watchObservedRunningTime="2026-04-16 14:40:22.197300705 +0000 UTC m=+628.815090468"
Apr 16 14:40:33.167207 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:33.167178 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:33.167651 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:33.167227 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gsvz9"
Apr 16 14:40:37.341535 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.341499 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"]
Apr 16 14:40:37.341909 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.341734 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" podUID="4c1ac03a-ba5a-4e62-9f70-cb64955730e1" containerName="manager" containerID="cri-o://f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8" gracePeriod=10
Apr 16 14:40:37.598223 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.598165 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:37.767160 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.767124 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-extensions-socket-volume\") pod \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") "
Apr 16 14:40:37.767346 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.767211 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jzc5\" (UniqueName: \"kubernetes.io/projected/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-kube-api-access-2jzc5\") pod \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\" (UID: \"4c1ac03a-ba5a-4e62-9f70-cb64955730e1\") "
Apr 16 14:40:37.767616 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.767590 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "4c1ac03a-ba5a-4e62-9f70-cb64955730e1" (UID: "4c1ac03a-ba5a-4e62-9f70-cb64955730e1"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:40:37.769432 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.769404 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-kube-api-access-2jzc5" (OuterVolumeSpecName: "kube-api-access-2jzc5") pod "4c1ac03a-ba5a-4e62-9f70-cb64955730e1" (UID: "4c1ac03a-ba5a-4e62-9f70-cb64955730e1"). InnerVolumeSpecName "kube-api-access-2jzc5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:40:37.868466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.868377 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2jzc5\" (UniqueName: \"kubernetes.io/projected/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-kube-api-access-2jzc5\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:40:37.868466 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:37.868409 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c1ac03a-ba5a-4e62-9f70-cb64955730e1-extensions-socket-volume\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\""
Apr 16 14:40:38.217504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.217407 2579 generic.go:358] "Generic (PLEG): container finished" podID="4c1ac03a-ba5a-4e62-9f70-cb64955730e1" containerID="f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8" exitCode=0
Apr 16 14:40:38.217504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.217456 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" event={"ID":"4c1ac03a-ba5a-4e62-9f70-cb64955730e1","Type":"ContainerDied","Data":"f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8"}
Apr 16 14:40:38.217504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.217486 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"
Apr 16 14:40:38.217504 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.217499 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g" event={"ID":"4c1ac03a-ba5a-4e62-9f70-cb64955730e1","Type":"ContainerDied","Data":"91def9e485f5cca8aa16c276dc90eaf5926b8ef5d21627c668b606fc4ad55a05"}
Apr 16 14:40:38.217828 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.217511 2579 scope.go:117] "RemoveContainer" containerID="f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8"
Apr 16 14:40:38.225989 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.225975 2579 scope.go:117] "RemoveContainer" containerID="f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8"
Apr 16 14:40:38.226281 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:40:38.226251 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8\": container with ID starting with f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8 not found: ID does not exist" containerID="f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8"
Apr 16 14:40:38.226386 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.226286 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8"} err="failed to get container status \"f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8\": rpc error: code = NotFound desc = could not find container \"f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8\": container with ID starting with f3fa8989d6180afffb2ac5290651bdb32d77b8914807580dc37c0233fdc524f8 not found: ID does not exist"
Apr 16 14:40:38.241598 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.241570 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"]
Apr 16 14:40:38.248499 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:38.248477 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-9nw7g"]
Apr 16 14:40:39.977882 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:39.977844 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1ac03a-ba5a-4e62-9f70-cb64955730e1" path="/var/lib/kubelet/pods/4c1ac03a-ba5a-4e62-9f70-cb64955730e1/volumes"
Apr 16 14:40:53.610018 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.609963 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"]
Apr 16 14:40:53.610653 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.610564 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c1ac03a-ba5a-4e62-9f70-cb64955730e1" containerName="manager"
Apr 16 14:40:53.610653 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.610584 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1ac03a-ba5a-4e62-9f70-cb64955730e1" containerName="manager"
Apr 16 14:40:53.610797 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.610673 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c1ac03a-ba5a-4e62-9f70-cb64955730e1" containerName="manager"
Apr 16 14:40:53.615719 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.615696 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.617933 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.617912 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-r9854\""
Apr 16 14:40:53.626807 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.626785 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"]
Apr 16 14:40:53.705311 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrqpj\" (UniqueName: \"kubernetes.io/projected/0fde0890-19e1-42aa-8b06-7b88945d97ca-kube-api-access-xrqpj\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705478 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0fde0890-19e1-42aa-8b06-7b88945d97ca-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705478 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705478 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705373 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705478 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705478 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705457 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705692 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705692 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.705692 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.705632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806495 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806668 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806668 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806748 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrqpj\" (UniqueName: \"kubernetes.io/projected/0fde0890-19e1-42aa-8b06-7b88945d97ca-kube-api-access-xrqpj\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806748 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0fde0890-19e1-42aa-8b06-7b88945d97ca-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806748 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806899 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806899 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806899 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.806899 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.807140 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.806977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.807229 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.807205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.807374 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.807353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.807468 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.807450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0fde0890-19e1-42aa-8b06-7b88945d97ca-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.808961 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.808934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"
Apr 16 14:40:53.809110 ip-10-0-140-144
kubenswrapper[2579]: I0416 14:40:53.809071 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:53.815830 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.815801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0fde0890-19e1-42aa-8b06-7b88945d97ca-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:53.816342 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.816306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrqpj\" (UniqueName: \"kubernetes.io/projected/0fde0890-19e1-42aa-8b06-7b88945d97ca-kube-api-access-xrqpj\") pod \"maas-default-gateway-openshift-default-58b6f876-gtk5m\" (UID: \"0fde0890-19e1-42aa-8b06-7b88945d97ca\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:53.930233 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.930206 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-r9854\"" Apr 16 14:40:53.938367 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:53.938349 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:54.080681 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.080651 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m"] Apr 16 14:40:54.081785 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:40:54.081762 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fde0890_19e1_42aa_8b06_7b88945d97ca.slice/crio-6308bf20a035f872ed7b5623fe985fe29c1a7bdab0100be0c7986b304c81944c WatchSource:0}: Error finding container 6308bf20a035f872ed7b5623fe985fe29c1a7bdab0100be0c7986b304c81944c: Status 404 returned error can't find the container with id 6308bf20a035f872ed7b5623fe985fe29c1a7bdab0100be0c7986b304c81944c Apr 16 14:40:54.083788 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.083756 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:40:54.083870 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.083814 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:40:54.083870 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.083845 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 16 14:40:54.278584 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.278539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" 
event={"ID":"0fde0890-19e1-42aa-8b06-7b88945d97ca","Type":"ContainerStarted","Data":"5c3ba3e6066d671281dfa27d3764c89c1c5114d27dae14bbabde4ed72ff02fbf"} Apr 16 14:40:54.278584 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.278578 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" event={"ID":"0fde0890-19e1-42aa-8b06-7b88945d97ca","Type":"ContainerStarted","Data":"6308bf20a035f872ed7b5623fe985fe29c1a7bdab0100be0c7986b304c81944c"} Apr 16 14:40:54.303729 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.303663 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" podStartSLOduration=1.3036432470000001 podStartE2EDuration="1.303643247s" podCreationTimestamp="2026-04-16 14:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:40:54.303328114 +0000 UTC m=+660.921117891" watchObservedRunningTime="2026-04-16 14:40:54.303643247 +0000 UTC m=+660.921433011" Apr 16 14:40:54.938801 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:54.938769 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:55.944303 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:55.944270 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:56.285824 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:56.285793 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:56.286900 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:56.286881 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gtk5m" Apr 16 14:40:57.955287 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:57.955206 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nztc2"] Apr 16 14:40:57.958786 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:57.958765 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:57.961398 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:57.961380 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-r2tmc\"" Apr 16 14:40:57.961497 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:57.961383 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 14:40:57.970272 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:57.970248 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nztc2"] Apr 16 14:40:58.047293 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.047258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhh6b\" (UniqueName: \"kubernetes.io/projected/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-kube-api-access-nhh6b\") pod \"limitador-limitador-7d549b5b-nztc2\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:58.047483 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.047333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-config-file\") pod \"limitador-limitador-7d549b5b-nztc2\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " 
pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:58.047937 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.047908 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nztc2"] Apr 16 14:40:58.148555 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.148515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-config-file\") pod \"limitador-limitador-7d549b5b-nztc2\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:58.148741 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.148639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhh6b\" (UniqueName: \"kubernetes.io/projected/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-kube-api-access-nhh6b\") pod \"limitador-limitador-7d549b5b-nztc2\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:58.149150 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.149130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-config-file\") pod \"limitador-limitador-7d549b5b-nztc2\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:58.157384 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.157361 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhh6b\" (UniqueName: \"kubernetes.io/projected/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-kube-api-access-nhh6b\") pod \"limitador-limitador-7d549b5b-nztc2\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:58.220324 ip-10-0-140-144 
kubenswrapper[2579]: I0416 14:40:58.220239 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n8rrq"] Apr 16 14:40:58.224231 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.224206 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.232157 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.232127 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n8rrq"] Apr 16 14:40:58.258610 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.258574 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n8rrq"] Apr 16 14:40:58.270449 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.270418 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:40:58.350319 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.350282 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/588c0261-0abb-4c7e-952c-5e04e6cbb288-config-file\") pod \"limitador-limitador-78c99df468-n8rrq\" (UID: \"588c0261-0abb-4c7e-952c-5e04e6cbb288\") " pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.350502 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.350322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbnvk\" (UniqueName: \"kubernetes.io/projected/588c0261-0abb-4c7e-952c-5e04e6cbb288-kube-api-access-mbnvk\") pod \"limitador-limitador-78c99df468-n8rrq\" (UID: \"588c0261-0abb-4c7e-952c-5e04e6cbb288\") " pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.405366 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.405341 2579 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nztc2"] Apr 16 14:40:58.407466 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:40:58.407442 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc65f11ca_3fb3_4544_a2b8_02e01f9583ae.slice/crio-450a62236e370e17cfc234a7a45a7383da20137f2778d62c79b723c5905fb3a6 WatchSource:0}: Error finding container 450a62236e370e17cfc234a7a45a7383da20137f2778d62c79b723c5905fb3a6: Status 404 returned error can't find the container with id 450a62236e370e17cfc234a7a45a7383da20137f2778d62c79b723c5905fb3a6 Apr 16 14:40:58.409490 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.409469 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:40:58.451938 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.451907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/588c0261-0abb-4c7e-952c-5e04e6cbb288-config-file\") pod \"limitador-limitador-78c99df468-n8rrq\" (UID: \"588c0261-0abb-4c7e-952c-5e04e6cbb288\") " pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.451938 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.451942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbnvk\" (UniqueName: \"kubernetes.io/projected/588c0261-0abb-4c7e-952c-5e04e6cbb288-kube-api-access-mbnvk\") pod \"limitador-limitador-78c99df468-n8rrq\" (UID: \"588c0261-0abb-4c7e-952c-5e04e6cbb288\") " pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.452634 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.452615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/588c0261-0abb-4c7e-952c-5e04e6cbb288-config-file\") pod 
\"limitador-limitador-78c99df468-n8rrq\" (UID: \"588c0261-0abb-4c7e-952c-5e04e6cbb288\") " pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.460228 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.460203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbnvk\" (UniqueName: \"kubernetes.io/projected/588c0261-0abb-4c7e-952c-5e04e6cbb288-kube-api-access-mbnvk\") pod \"limitador-limitador-78c99df468-n8rrq\" (UID: \"588c0261-0abb-4c7e-952c-5e04e6cbb288\") " pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.538207 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.538170 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:40:58.686747 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:58.686723 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-n8rrq"] Apr 16 14:40:58.688878 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:40:58.688851 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588c0261_0abb_4c7e_952c_5e04e6cbb288.slice/crio-e42c3d5d5dc22265aa3306d591563ced552088e8a1d06feb8a46adbfdfbef077 WatchSource:0}: Error finding container e42c3d5d5dc22265aa3306d591563ced552088e8a1d06feb8a46adbfdfbef077: Status 404 returned error can't find the container with id e42c3d5d5dc22265aa3306d591563ced552088e8a1d06feb8a46adbfdfbef077 Apr 16 14:40:59.311861 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:59.311822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" event={"ID":"c65f11ca-3fb3-4544-a2b8-02e01f9583ae","Type":"ContainerStarted","Data":"450a62236e370e17cfc234a7a45a7383da20137f2778d62c79b723c5905fb3a6"} Apr 16 14:40:59.317691 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:40:59.317652 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" event={"ID":"588c0261-0abb-4c7e-952c-5e04e6cbb288","Type":"ContainerStarted","Data":"e42c3d5d5dc22265aa3306d591563ced552088e8a1d06feb8a46adbfdfbef077"} Apr 16 14:41:02.330534 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:02.330495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" event={"ID":"c65f11ca-3fb3-4544-a2b8-02e01f9583ae","Type":"ContainerStarted","Data":"66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba"} Apr 16 14:41:02.330949 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:02.330554 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:41:02.331926 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:02.331905 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" event={"ID":"588c0261-0abb-4c7e-952c-5e04e6cbb288","Type":"ContainerStarted","Data":"dd65c313bb5c213b27dcd133d432527f9292e6bf53b8ad97e1bd8e8bc55aed33"} Apr 16 14:41:02.332007 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:02.331992 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:41:02.350052 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:02.349979 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" podStartSLOduration=2.324843084 podStartE2EDuration="5.34996223s" podCreationTimestamp="2026-04-16 14:40:57 +0000 UTC" firstStartedPulling="2026-04-16 14:40:58.409661153 +0000 UTC m=+665.027450894" lastFinishedPulling="2026-04-16 14:41:01.434780289 +0000 UTC m=+668.052570040" observedRunningTime="2026-04-16 14:41:02.349292059 +0000 UTC m=+668.967081822" watchObservedRunningTime="2026-04-16 
14:41:02.34996223 +0000 UTC m=+668.967751994" Apr 16 14:41:02.366316 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:02.366266 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" podStartSLOduration=1.615072447 podStartE2EDuration="4.366252877s" podCreationTimestamp="2026-04-16 14:40:58 +0000 UTC" firstStartedPulling="2026-04-16 14:40:58.690662391 +0000 UTC m=+665.308452139" lastFinishedPulling="2026-04-16 14:41:01.441842764 +0000 UTC m=+668.059632569" observedRunningTime="2026-04-16 14:41:02.364321196 +0000 UTC m=+668.982110956" watchObservedRunningTime="2026-04-16 14:41:02.366252877 +0000 UTC m=+668.984042639" Apr 16 14:41:13.337094 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:13.337062 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-n8rrq" Apr 16 14:41:13.337533 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:13.337113 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:41:13.406868 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:13.406831 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nztc2"] Apr 16 14:41:13.407092 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:13.407049 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" podUID="c65f11ca-3fb3-4544-a2b8-02e01f9583ae" containerName="limitador" containerID="cri-o://66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba" gracePeriod=30 Apr 16 14:41:13.951908 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:13.951883 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:41:14.105175 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.105144 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-config-file\") pod \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " Apr 16 14:41:14.105175 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.105182 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhh6b\" (UniqueName: \"kubernetes.io/projected/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-kube-api-access-nhh6b\") pod \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\" (UID: \"c65f11ca-3fb3-4544-a2b8-02e01f9583ae\") " Apr 16 14:41:14.105501 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.105481 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-config-file" (OuterVolumeSpecName: "config-file") pod "c65f11ca-3fb3-4544-a2b8-02e01f9583ae" (UID: "c65f11ca-3fb3-4544-a2b8-02e01f9583ae"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:41:14.107312 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.107280 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-kube-api-access-nhh6b" (OuterVolumeSpecName: "kube-api-access-nhh6b") pod "c65f11ca-3fb3-4544-a2b8-02e01f9583ae" (UID: "c65f11ca-3fb3-4544-a2b8-02e01f9583ae"). InnerVolumeSpecName "kube-api-access-nhh6b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:41:14.206018 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.205983 2579 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-config-file\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:41:14.206018 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.206014 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhh6b\" (UniqueName: \"kubernetes.io/projected/c65f11ca-3fb3-4544-a2b8-02e01f9583ae-kube-api-access-nhh6b\") on node \"ip-10-0-140-144.ec2.internal\" DevicePath \"\"" Apr 16 14:41:14.377821 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.377732 2579 generic.go:358] "Generic (PLEG): container finished" podID="c65f11ca-3fb3-4544-a2b8-02e01f9583ae" containerID="66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba" exitCode=0 Apr 16 14:41:14.377821 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.377799 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" Apr 16 14:41:14.378282 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.377800 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" event={"ID":"c65f11ca-3fb3-4544-a2b8-02e01f9583ae","Type":"ContainerDied","Data":"66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba"} Apr 16 14:41:14.378282 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.377898 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nztc2" event={"ID":"c65f11ca-3fb3-4544-a2b8-02e01f9583ae","Type":"ContainerDied","Data":"450a62236e370e17cfc234a7a45a7383da20137f2778d62c79b723c5905fb3a6"} Apr 16 14:41:14.378282 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.377918 2579 scope.go:117] "RemoveContainer" containerID="66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba" Apr 16 14:41:14.386539 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.386520 2579 scope.go:117] "RemoveContainer" containerID="66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba" Apr 16 14:41:14.386776 ip-10-0-140-144 kubenswrapper[2579]: E0416 14:41:14.386758 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba\": container with ID starting with 66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba not found: ID does not exist" containerID="66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba" Apr 16 14:41:14.386825 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.386785 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba"} err="failed to get container status 
\"66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba\": rpc error: code = NotFound desc = could not find container \"66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba\": container with ID starting with 66886b4ee21ce900748c2a0651e40239cd8e955a2c7336c273c2908cc41660ba not found: ID does not exist" Apr 16 14:41:14.403440 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.403415 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nztc2"] Apr 16 14:41:14.408089 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:14.408062 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nztc2"] Apr 16 14:41:15.978436 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:15.978405 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65f11ca-3fb3-4544-a2b8-02e01f9583ae" path="/var/lib/kubelet/pods/c65f11ca-3fb3-4544-a2b8-02e01f9583ae/volumes" Apr 16 14:41:19.070964 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.070927 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-crp8n"] Apr 16 14:41:19.071556 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.071527 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c65f11ca-3fb3-4544-a2b8-02e01f9583ae" containerName="limitador" Apr 16 14:41:19.071636 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.071560 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65f11ca-3fb3-4544-a2b8-02e01f9583ae" containerName="limitador" Apr 16 14:41:19.071680 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.071663 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c65f11ca-3fb3-4544-a2b8-02e01f9583ae" containerName="limitador" Apr 16 14:41:19.076552 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.076527 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.080629 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.080609 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 14:41:19.081268 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.081249 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-vj4nj\"" Apr 16 14:41:19.084186 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.084166 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-crp8n"] Apr 16 14:41:19.255865 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.255824 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b55b6a93-720b-41aa-b058-8a2c92f86afd-data\") pod \"postgres-868db5846d-crp8n\" (UID: \"b55b6a93-720b-41aa-b058-8a2c92f86afd\") " pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.256069 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.255872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m72g7\" (UniqueName: \"kubernetes.io/projected/b55b6a93-720b-41aa-b058-8a2c92f86afd-kube-api-access-m72g7\") pod \"postgres-868db5846d-crp8n\" (UID: \"b55b6a93-720b-41aa-b058-8a2c92f86afd\") " pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.356748 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.356648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b55b6a93-720b-41aa-b058-8a2c92f86afd-data\") pod \"postgres-868db5846d-crp8n\" (UID: \"b55b6a93-720b-41aa-b058-8a2c92f86afd\") " pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.356748 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.356696 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m72g7\" (UniqueName: \"kubernetes.io/projected/b55b6a93-720b-41aa-b058-8a2c92f86afd-kube-api-access-m72g7\") pod \"postgres-868db5846d-crp8n\" (UID: \"b55b6a93-720b-41aa-b058-8a2c92f86afd\") " pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.357148 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.357125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b55b6a93-720b-41aa-b058-8a2c92f86afd-data\") pod \"postgres-868db5846d-crp8n\" (UID: \"b55b6a93-720b-41aa-b058-8a2c92f86afd\") " pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.368738 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.368707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m72g7\" (UniqueName: \"kubernetes.io/projected/b55b6a93-720b-41aa-b058-8a2c92f86afd-kube-api-access-m72g7\") pod \"postgres-868db5846d-crp8n\" (UID: \"b55b6a93-720b-41aa-b058-8a2c92f86afd\") " pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.389051 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.389007 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:19.723660 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:19.723636 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-crp8n"] Apr 16 14:41:19.726302 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:41:19.726272 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb55b6a93_720b_41aa_b058_8a2c92f86afd.slice/crio-57f9e6803ff8c14b8207e48d08c47199525b21ae63a8aafed2e4281fb7c21a03 WatchSource:0}: Error finding container 57f9e6803ff8c14b8207e48d08c47199525b21ae63a8aafed2e4281fb7c21a03: Status 404 returned error can't find the container with id 57f9e6803ff8c14b8207e48d08c47199525b21ae63a8aafed2e4281fb7c21a03 Apr 16 14:41:20.399803 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:20.399765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-crp8n" event={"ID":"b55b6a93-720b-41aa-b058-8a2c92f86afd","Type":"ContainerStarted","Data":"57f9e6803ff8c14b8207e48d08c47199525b21ae63a8aafed2e4281fb7c21a03"} Apr 16 14:41:26.423270 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:26.423231 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-crp8n" event={"ID":"b55b6a93-720b-41aa-b058-8a2c92f86afd","Type":"ContainerStarted","Data":"cff06a7902ff3b093221b2deced8d4a71917ed753f5d4b771f7e36e6c59eab20"} Apr 16 14:41:26.423672 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:26.423387 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:41:26.444357 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:26.444308 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-crp8n" podStartSLOduration=0.907804392 podStartE2EDuration="7.444293804s" podCreationTimestamp="2026-04-16 14:41:19 +0000 UTC" 
firstStartedPulling="2026-04-16 14:41:19.72752676 +0000 UTC m=+686.345316501" lastFinishedPulling="2026-04-16 14:41:26.264016164 +0000 UTC m=+692.881805913" observedRunningTime="2026-04-16 14:41:26.442985466 +0000 UTC m=+693.060775230" watchObservedRunningTime="2026-04-16 14:41:26.444293804 +0000 UTC m=+693.062083566" Apr 16 14:41:32.455976 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:41:32.455902 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-crp8n" Apr 16 14:44:53.907618 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:44:53.907530 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:44:53.909928 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:44:53.909902 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:44:53.910136 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:44:53.910119 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:44:53.912566 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:44:53.912548 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:49:53.933236 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:49:53.933210 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:49:53.935621 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:49:53.935599 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:49:53.936239 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:49:53.936219 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:49:53.938494 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:49:53.938471 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vclsl_dc8ac89f-dee2-4e7c-b409-39b0900c673e/ovn-acl-logging/0.log" Apr 16 14:52:15.657671 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:15.657638 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2_396e2470-cef3-48c8-adf2-6ac7cb57b2e3/manager/0.log" Apr 16 14:52:15.884630 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:15.884603 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-crp8n_b55b6a93-720b-41aa-b058-8a2c92f86afd/postgres/0.log" Apr 16 14:52:17.613619 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:17.613587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-mklm2_b9899edc-232d-4353-bac7-ff31a12e96b6/registry-server/0.log" Apr 16 14:52:17.845055 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:17.845011 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-n8rrq_588c0261-0abb-4c7e-952c-5e04e6cbb288/limitador/0.log" Apr 16 14:52:17.960545 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:17.960464 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-gsvz9_c06bde55-395a-4576-8df5-d90d1fed8162/manager/0.log" Apr 16 14:52:18.302665 ip-10-0-140-144 kubenswrapper[2579]: I0416 
14:52:18.302623 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9_0400e2bb-d2d7-4633-bdc2-77847ef73977/istio-proxy/0.log" Apr 16 14:52:18.756977 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:18.756895 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-gtk5m_0fde0890-19e1-42aa-8b06-7b88945d97ca/istio-proxy/0.log" Apr 16 14:52:25.819412 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:25.819380 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-j5tck_32a0461a-84ba-4c24-a026-df53dabb8bbe/global-pull-secret-syncer/0.log" Apr 16 14:52:26.039808 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:26.039779 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-t5h5z_42343077-d6d1-4ad1-ba48-405f8545fbef/konnectivity-agent/0.log" Apr 16 14:52:26.090417 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:26.090343 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-144.ec2.internal_ab72f00075c1175e59c7e696357a6702/haproxy/0.log" Apr 16 14:52:30.460762 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:30.460734 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-mklm2_b9899edc-232d-4353-bac7-ff31a12e96b6/registry-server/0.log" Apr 16 14:52:30.530273 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:30.530233 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-n8rrq_588c0261-0abb-4c7e-952c-5e04e6cbb288/limitador/0.log" Apr 16 14:52:30.560422 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:30.560390 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-gsvz9_c06bde55-395a-4576-8df5-d90d1fed8162/manager/0.log" Apr 16 14:52:31.854360 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:31.854279 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b5bff235-fbd3-4652-b00f-7ab1933a3a94/alertmanager/0.log" Apr 16 14:52:31.878312 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:31.878285 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b5bff235-fbd3-4652-b00f-7ab1933a3a94/config-reloader/0.log" Apr 16 14:52:31.903410 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:31.903390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b5bff235-fbd3-4652-b00f-7ab1933a3a94/kube-rbac-proxy-web/0.log" Apr 16 14:52:31.927771 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:31.927744 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b5bff235-fbd3-4652-b00f-7ab1933a3a94/kube-rbac-proxy/0.log" Apr 16 14:52:31.959164 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:31.959131 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b5bff235-fbd3-4652-b00f-7ab1933a3a94/kube-rbac-proxy-metric/0.log" Apr 16 14:52:31.989172 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:31.989144 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b5bff235-fbd3-4652-b00f-7ab1933a3a94/prom-label-proxy/0.log" Apr 16 14:52:32.022681 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.022655 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b5bff235-fbd3-4652-b00f-7ab1933a3a94/init-config-reloader/0.log" Apr 16 14:52:32.118375 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.118297 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-g8nc9_f50b4fad-71ea-4a5d-bac7-f28d01dab723/kube-state-metrics/0.log" Apr 16 14:52:32.145913 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.145890 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-g8nc9_f50b4fad-71ea-4a5d-bac7-f28d01dab723/kube-rbac-proxy-main/0.log" Apr 16 14:52:32.182556 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.182531 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-g8nc9_f50b4fad-71ea-4a5d-bac7-f28d01dab723/kube-rbac-proxy-self/0.log" Apr 16 14:52:32.218583 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.218538 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7c4b5cb6c-c868m_2cf40cf8-92d8-4a5a-afda-a5c5e3009baf/metrics-server/0.log" Apr 16 14:52:32.464276 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.464205 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lqxdm_06ded581-64bf-4eda-b995-f0d8319c99ac/node-exporter/0.log" Apr 16 14:52:32.487936 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.487915 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lqxdm_06ded581-64bf-4eda-b995-f0d8319c99ac/kube-rbac-proxy/0.log" Apr 16 14:52:32.514293 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.514274 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lqxdm_06ded581-64bf-4eda-b995-f0d8319c99ac/init-textfile/0.log" Apr 16 14:52:32.660860 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.660830 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6f4294d-d673-4af4-9961-72525953ee7f/prometheus/0.log" Apr 16 14:52:32.687274 ip-10-0-140-144 kubenswrapper[2579]: I0416 
14:52:32.687245 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6f4294d-d673-4af4-9961-72525953ee7f/config-reloader/0.log" Apr 16 14:52:32.714869 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.714791 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6f4294d-d673-4af4-9961-72525953ee7f/thanos-sidecar/0.log" Apr 16 14:52:32.745929 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.745903 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6f4294d-d673-4af4-9961-72525953ee7f/kube-rbac-proxy-web/0.log" Apr 16 14:52:32.776351 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.776323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6f4294d-d673-4af4-9961-72525953ee7f/kube-rbac-proxy/0.log" Apr 16 14:52:32.802679 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.802628 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6f4294d-d673-4af4-9961-72525953ee7f/kube-rbac-proxy-thanos/0.log" Apr 16 14:52:32.839168 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.839103 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b6f4294d-d673-4af4-9961-72525953ee7f/init-config-reloader/0.log" Apr 16 14:52:32.975010 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:32.974939 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f8d7b8476-mqfpr_b19fca59-c615-4671-80aa-d1f9d57ddd74/telemeter-client/0.log" Apr 16 14:52:33.000944 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:33.000923 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f8d7b8476-mqfpr_b19fca59-c615-4671-80aa-d1f9d57ddd74/reload/0.log" Apr 16 14:52:33.027627 ip-10-0-140-144 kubenswrapper[2579]: I0416 
14:52:33.027598 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f8d7b8476-mqfpr_b19fca59-c615-4671-80aa-d1f9d57ddd74/kube-rbac-proxy/0.log" Apr 16 14:52:34.266610 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.266584 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-mxz6w_bcf5dc26-6a7b-4d47-bd94-a76c16744638/networking-console-plugin/0.log" Apr 16 14:52:34.623007 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.622974 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n"] Apr 16 14:52:34.626685 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.626664 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.629286 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.629263 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mcxmz\"/\"openshift-service-ca.crt\"" Apr 16 14:52:34.629402 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.629288 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mcxmz\"/\"kube-root-ca.crt\"" Apr 16 14:52:34.629402 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.629306 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mcxmz\"/\"default-dockercfg-d2ggf\"" Apr 16 14:52:34.634406 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.634387 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n"] Apr 16 14:52:34.649793 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.649767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdzg\" (UniqueName: 
\"kubernetes.io/projected/4405016a-cea0-4801-8cf0-c3c65ea2da7b-kube-api-access-npdzg\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.649887 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.649806 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-podres\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.649940 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.649886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-sys\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.649940 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.649927 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-proc\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.650016 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.649951 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-lib-modules\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 
16 14:52:34.750826 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.750798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npdzg\" (UniqueName: \"kubernetes.io/projected/4405016a-cea0-4801-8cf0-c3c65ea2da7b-kube-api-access-npdzg\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.750959 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.750834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-podres\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.750959 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.750886 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-sys\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.750959 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.750924 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-proc\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.750959 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.750946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-lib-modules\") pod \"perf-node-gather-daemonset-xcw2n\" 
(UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.751130 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.750997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-sys\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.751130 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.751012 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-proc\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.751130 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.751065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-podres\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.751130 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.751079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4405016a-cea0-4801-8cf0-c3c65ea2da7b-lib-modules\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.759800 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.759779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdzg\" (UniqueName: 
\"kubernetes.io/projected/4405016a-cea0-4801-8cf0-c3c65ea2da7b-kube-api-access-npdzg\") pod \"perf-node-gather-daemonset-xcw2n\" (UID: \"4405016a-cea0-4801-8cf0-c3c65ea2da7b\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:34.873755 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.873682 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/1.log" Apr 16 14:52:34.878964 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.878940 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-lr5dz_fca51d46-f492-4bcf-8d66-7ecb32e54d54/console-operator/2.log" Apr 16 14:52:34.937304 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:34.937274 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:35.059355 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.059331 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n"] Apr 16 14:52:35.062014 ip-10-0-140-144 kubenswrapper[2579]: W0416 14:52:35.061984 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4405016a_cea0_4801_8cf0_c3c65ea2da7b.slice/crio-f4c9ec34eab429392a67f78fef2cc54975e9d79fd5c0fb841e376740335ec2b1 WatchSource:0}: Error finding container f4c9ec34eab429392a67f78fef2cc54975e9d79fd5c0fb841e376740335ec2b1: Status 404 returned error can't find the container with id f4c9ec34eab429392a67f78fef2cc54975e9d79fd5c0fb841e376740335ec2b1 Apr 16 14:52:35.063521 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.063494 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:35.427940 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.427865 
2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7964f57bdf-jqk5z_08fa673f-343b-4303-89a6-7d03095cc905/console/0.log" Apr 16 14:52:35.471805 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.471772 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-zvqf4_c405329f-15f2-4fd4-8b05-4f3e19783a99/download-server/0.log" Apr 16 14:52:35.777645 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.777610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" event={"ID":"4405016a-cea0-4801-8cf0-c3c65ea2da7b","Type":"ContainerStarted","Data":"b53e35ca6e24af865d0fb259b074bb76b8e817945fbb2c35f3ccc1774f66ea5c"} Apr 16 14:52:35.777645 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.777651 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" event={"ID":"4405016a-cea0-4801-8cf0-c3c65ea2da7b","Type":"ContainerStarted","Data":"f4c9ec34eab429392a67f78fef2cc54975e9d79fd5c0fb841e376740335ec2b1"} Apr 16 14:52:35.777859 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.777705 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:35.796288 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:35.796239 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" podStartSLOduration=1.7962263699999999 podStartE2EDuration="1.79622637s" podCreationTimestamp="2026-04-16 14:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:35.793479203 +0000 UTC m=+1362.411268966" watchObservedRunningTime="2026-04-16 14:52:35.79622637 +0000 UTC m=+1362.414016134" Apr 16 14:52:36.027263 
ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:36.027219 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-gr6dx_3e29e4f2-44c1-4a7f-b243-c4e1d79a60ae/volume-data-source-validator/0.log" Apr 16 14:52:36.803396 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:36.803364 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4t4r6_0b44026e-170c-4488-824c-c757c82f68cd/dns/0.log" Apr 16 14:52:36.826562 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:36.826534 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4t4r6_0b44026e-170c-4488-824c-c757c82f68cd/kube-rbac-proxy/0.log" Apr 16 14:52:36.981014 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:36.980984 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vjfvw_1ec5f4c5-a526-4cf8-82d9-0b554ed2fbaa/dns-node-resolver/0.log" Apr 16 14:52:37.524860 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:37.524834 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4648f_76137a5e-4deb-4e87-b7e5-d17bde0111e4/node-ca/0.log" Apr 16 14:52:38.444374 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:38.444343 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfvbks9_0400e2bb-d2d7-4633-bdc2-77847ef73977/istio-proxy/0.log" Apr 16 14:52:38.585865 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:38.585838 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-gtk5m_0fde0890-19e1-42aa-8b06-7b88945d97ca/istio-proxy/0.log" Apr 16 14:52:39.238945 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:39.238917 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-src8k_93aad6d1-f6fd-49fd-aeac-3661dd3118bf/serve-healthcheck-canary/0.log" Apr 16 14:52:39.796239 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:39.796207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6w8wv_eadb0665-d202-43bc-b37c-41f49b666d83/kube-rbac-proxy/0.log" Apr 16 14:52:39.818537 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:39.818509 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6w8wv_eadb0665-d202-43bc-b37c-41f49b666d83/exporter/0.log" Apr 16 14:52:39.843222 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:39.843197 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6w8wv_eadb0665-d202-43bc-b37c-41f49b666d83/extractor/0.log" Apr 16 14:52:41.791838 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:41.791814 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-xcw2n" Apr 16 14:52:42.068678 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:42.068563 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6f7bb56bb6-cmhw2_396e2470-cef3-48c8-adf2-6ac7cb57b2e3/manager/0.log" Apr 16 14:52:42.143616 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:42.143586 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-crp8n_b55b6a93-720b-41aa-b058-8a2c92f86afd/postgres/0.log" Apr 16 14:52:43.468801 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:43.468774 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-76cd85c697-7cbd8_71dd7fb7-6b6c-4fd2-8573-d5aeefd1521c/manager/0.log" Apr 16 14:52:48.404170 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:48.404133 2579 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-ltfsn_b73c7eb9-b024-4371-b219-6f114a27050f/migrator/0.log" Apr 16 14:52:48.427529 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:48.427485 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-ltfsn_b73c7eb9-b024-4371-b219-6f114a27050f/graceful-termination/0.log" Apr 16 14:52:50.100335 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.100310 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7dhhl_88fbb2da-2de8-4d14-aa18-1817fb16e61c/kube-multus-additional-cni-plugins/0.log" Apr 16 14:52:50.125750 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.125721 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7dhhl_88fbb2da-2de8-4d14-aa18-1817fb16e61c/egress-router-binary-copy/0.log" Apr 16 14:52:50.150051 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.150004 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7dhhl_88fbb2da-2de8-4d14-aa18-1817fb16e61c/cni-plugins/0.log" Apr 16 14:52:50.173581 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.173552 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7dhhl_88fbb2da-2de8-4d14-aa18-1817fb16e61c/bond-cni-plugin/0.log" Apr 16 14:52:50.197563 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.197539 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7dhhl_88fbb2da-2de8-4d14-aa18-1817fb16e61c/routeoverride-cni/0.log" Apr 16 14:52:50.255754 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.255721 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7dhhl_88fbb2da-2de8-4d14-aa18-1817fb16e61c/whereabouts-cni-bincopy/0.log" Apr 16 14:52:50.290644 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.290620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7dhhl_88fbb2da-2de8-4d14-aa18-1817fb16e61c/whereabouts-cni/0.log" Apr 16 14:52:50.515613 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.515527 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v76f6_79ff87e3-bc60-4320-a398-0c605679612c/kube-multus/0.log" Apr 16 14:52:50.667507 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.667471 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m57qr_20614211-2bef-41dc-aad8-94242eb8364c/network-metrics-daemon/0.log" Apr 16 14:52:50.690122 ip-10-0-140-144 kubenswrapper[2579]: I0416 14:52:50.690096 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m57qr_20614211-2bef-41dc-aad8-94242eb8364c/kube-rbac-proxy/0.log"