Apr 24 21:15:39.891148 ip-10-0-134-147 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:15:40.320996 ip-10-0-134-147 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:40.320996 ip-10-0-134-147 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:15:40.320996 ip-10-0-134-147 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:40.320996 ip-10-0-134-147 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:15:40.320996 ip-10-0-134-147 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:15:40.322513 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.322428 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:15:40.325556 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325541 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325564 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325569 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325572 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325575 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325578 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325581 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325584 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325587 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325590 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325592 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:15:40.325591 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325595 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325599 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325601 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325604 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325607 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325610 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325612 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325615 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325618 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325622 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325626 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325629 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325632 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325635 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325638 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325641 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325643 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325646 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325649 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:15:40.325889 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325651 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325662 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325665 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325669 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325671 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325674 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325678 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325682 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325685 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325688 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325690 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325693 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325696 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325698 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325701 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325704 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325707 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325710 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325712 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325715 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:15:40.326337 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325718 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325721 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325724 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325727 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325730 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325732 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325735 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325738 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325740 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325743 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325746 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325748 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325751 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325754 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325756 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325759 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325762 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325765 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325768 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325770 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:15:40.326820 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325773 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325776 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325778 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325781 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325783 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325786 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325789 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325793 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325795 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325798 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325801 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325804 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325806 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325809 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325811 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.325814 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326214 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326219 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326223 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:15:40.327319 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326227 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326231 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326234 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326237 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326240 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326243 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326246 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326248 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326251 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326254 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326258 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326261 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326264 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326267 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326270 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326272 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326275 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326277 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326280 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326283 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:15:40.327803 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326286 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326289 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326291 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326294 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326297 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326299 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326302 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326304 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326307 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326309 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326312 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326314 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326317 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326320 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326322 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326324 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326327 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326330 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326332 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326335 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:15:40.328294 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326337 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326340 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326344 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326348 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326350 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326353 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326355 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326358 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326361 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326363 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326366 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326368 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326371 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326374 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326376 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326379 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326381 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326384 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326387 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326390 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:15:40.328778 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326392 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326395 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326398 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326400 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326403 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326405 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326408 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326410 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326413 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326416 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326418 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326421 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326423 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326426 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326429 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326432 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326436 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326438 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326441 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326443 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:40.329280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326446 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326449 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.326452 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327766 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327775 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327782 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327786 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327791 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327794 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327798 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327803 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327806 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327810 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327813 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327817 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327820 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327823 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327826 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327830 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327833 2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327836 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327839 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327844 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327847 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:15:40.329796 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327850 2573 flags.go:64] FLAG: --config-dir=""
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327853 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327856 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327860 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327864 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327878 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327882 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327885 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327889 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327892 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327896 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327899 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327903 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327906 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327909 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327912 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327915 2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327918 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327923 2573 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327926 2573 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327929 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327932 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327935 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327939 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:15:40.330395 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327942 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327946 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327949 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327952 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327955 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327958 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327961 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327964 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327967 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327970 2573 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327974 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327977 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327980 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327984 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327987 2573 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327991 2573 flags.go:64] FLAG: --help="false"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327994 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.327997 2573 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328000 2573 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 24 21:15:40.331027 ip-10-0-134-147
kubenswrapper[2573]: I0424 21:15:40.328003 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328006 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328010 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328013 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:15:40.331027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328016 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328018 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328022 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328025 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328028 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328031 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328034 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328037 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328040 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328043 2573 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328046 2573 flags.go:64] FLAG: --lock-file="" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328049 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328052 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328055 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328060 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328063 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328066 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328070 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328072 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328076 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328079 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328082 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328086 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328090 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:15:40.331583 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:15:40.328094 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 21:15:40.331583 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328097 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328100 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328103 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328107 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328110 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328113 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328116 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328123 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328127 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328130 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328133 2573 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328136 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328141 2573 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328144 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328147 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328150 2573 flags.go:64] FLAG: --port="10250" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328154 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328157 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09bea0e6bf533f170" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328160 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328163 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328166 2573 flags.go:64] FLAG: --register-node="true" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328169 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328172 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328176 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:15:40.332192 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328179 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328182 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328185 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328189 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 
21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328192 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328195 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328198 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328203 2573 flags.go:64] FLAG: --runonce="false" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328206 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328209 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328213 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328216 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328219 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328222 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328225 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328229 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328232 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328235 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: 
I0424 21:15:40.328238 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328240 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328244 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328247 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328249 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328255 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328258 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:15:40.332792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328261 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328266 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328268 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328271 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328274 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328277 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328280 2573 flags.go:64] FLAG: --v="2" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328284 2573 flags.go:64] FLAG: --version="false" Apr 24 21:15:40.333409 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328289 2573 flags.go:64] FLAG: --vmodule="" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328293 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.328296 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328385 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328389 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328392 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328395 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328399 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328402 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328405 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328408 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328411 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328414 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:15:40.333409 
ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328417 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:15:40.333409 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328420 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328422 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328426 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328428 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328431 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328434 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328436 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328439 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328441 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328444 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328447 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328449 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 
24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328452 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328454 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328457 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328460 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328462 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328465 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328468 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328471 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:15:40.333975 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328473 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328476 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328479 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328481 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 
21:15:40.328489 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328491 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328494 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328497 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328500 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328503 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328505 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328508 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328512 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328515 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328517 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328520 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328523 2573 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328525 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328528 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:40.334502 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328530 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328533 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328535 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328538 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328540 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328543 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328547 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328551 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328554 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328557 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328559 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328562 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328565 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328567 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328571 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328575 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328577 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328582 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328584 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:15:40.335273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328587 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328590 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328593 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328596 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328599 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328601 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328605 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328607 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328610 2573 feature_gate.go:328] 
unrecognized feature gate: AzureMultiDisk Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328613 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328615 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328618 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328620 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328623 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328626 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328628 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:15:40.335948 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.328631 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:15:40.336377 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.329160 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:15:40.336598 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.336579 2573 server.go:530] "Kubelet version" 
kubeletVersion="v1.33.9" Apr 24 21:15:40.336630 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.336600 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:15:40.336659 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336648 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:15:40.336659 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336654 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:15:40.336659 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336657 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336661 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336664 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336667 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336671 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336675 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336678 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336681 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336684 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336686 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336689 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336692 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336695 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336697 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336700 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336703 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336705 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336708 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336711 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:15:40.336737 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336713 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336716 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336719 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336722 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336724 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336727 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336730 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336732 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336735 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336737 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336741 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336743 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336746 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336749 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336752 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336755 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336757 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336760 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336762 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336765 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:15:40.337220 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336768 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336771 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336773 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336776 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336779 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336781 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336784 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336787 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336789 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336792 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336794 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336797 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336799 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336802 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336805 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336808 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336811 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336813 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336816 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336818 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:15:40.337713 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336821 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336823 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336826 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336829 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336831 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336834 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336836 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336839 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336843 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336847 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336850 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336853 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336856 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336859 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336861 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336865 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336882 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336886 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336889 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336892 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:15:40.338226 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336895 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336897 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336900 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336903 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.336905 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.336911 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337002 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337008 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337011 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337014 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337017 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337019 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337022 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337025 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337028 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337031 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:15:40.338702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337033 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337037 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337041 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337044 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337046 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337049 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337051 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337054 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337058 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337061 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337064 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337067 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337070 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337073 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337076 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337079 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337082 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337085 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337087 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:15:40.339116 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337090 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337093 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337095 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337098 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337102 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337104 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337107 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337110 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337112 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337115 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337117 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337120 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337123 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337125 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337128 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337130 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337133 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337135 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337138 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337140 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:15:40.339604 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337143 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337145 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337148 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337150 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337153 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337156 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337159 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337161 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337164 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337166 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337169 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337172 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337174 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337177 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337179 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337182 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337185 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337188 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337190 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337193 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:15:40.340102 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337195 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337198 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337200 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337203 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337205 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337208 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337211 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337213 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337216 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337218 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337221 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337224 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337227 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337229 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337232 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337235 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:15:40.340629 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:40.337237 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:15:40.341051 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.337242 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:15:40.341051 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.337858 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:15:40.341051 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.340141 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:15:40.341051 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.340931 2573 server.go:1019] "Starting client certificate rotation"
Apr 24 21:15:40.341051 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.341041 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:15:40.341263 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.341108 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:15:40.367131 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.367108 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:15:40.374035 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.374011 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:15:40.386193 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.386174 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:15:40.391984 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.391931 2573 log.go:25] "Validated CRI v1 image API"
Apr 24 21:15:40.393648 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.393629 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:15:40.397095 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.397072 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7a54b650-8db1-4e99-9ad2-6168df77dcd7:/dev/nvme0n1p3 ce44f6f8-a326-47d6-849f-e79bcf90053a:/dev/nvme0n1p4]
Apr 24 21:15:40.397159 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.397095 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:15:40.400170 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.400152 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:15:40.402610 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.402512 2573 manager.go:217] Machine: {Timestamp:2026-04-24 21:15:40.400730024 +0000 UTC m=+0.396480773 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099560 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23adb4e2a4508ae79df3a985bf9f0d SystemUUID:ec23adb4-e2a4-508a-e79d-f3a985bf9f0d BootID:67994fae-97ff-4e52-9193-fc6410c74304 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7a:82:c6:67:6d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7a:82:c6:67:6d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:fb:92:7d:f8:18 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:15:40.402610 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.402609 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:15:40.402708 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.402688 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:15:40.404896 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.404860 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:15:40.405029 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.404899 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-147.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:15:40.405071 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.405038 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:15:40.405071 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.405047 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:15:40.405071 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.405061
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:15:40.405696 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.405685 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:15:40.407643 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.407633 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:15:40.407744 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.407735 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:15:40.410607 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.410597 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:15:40.410639 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.410611 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:15:40.410639 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.410622 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:15:40.410639 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.410631 2573 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:15:40.410639 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.410639 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:15:40.411674 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.411661 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:15:40.411722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.411687 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:15:40.414566 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.414550 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:15:40.415694 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:15:40.415681 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:15:40.417689 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417672 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:15:40.417689 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417692 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417698 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417704 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417709 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417715 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417723 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417733 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417742 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417751 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417768 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 
21:15:40.417803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.417779 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:15:40.418574 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.418564 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:15:40.418574 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.418573 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:15:40.421991 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.421977 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:15:40.422061 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.422011 2573 server.go:1295] "Started kubelet" Apr 24 21:15:40.422143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.422113 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:15:40.422205 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.422102 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:15:40.422205 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.422175 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:15:40.422764 ip-10-0-134-147 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:15:40.423140 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.423122 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:15:40.424803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.424789 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:15:40.425754 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.425720 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:15:40.425849 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.425793 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-147.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:15:40.425849 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.425812 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:15:40.429639 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.429622 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:15:40.430441 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.430424 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:15:40.431146 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.430098 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-147.ec2.internal.18a967830682e316 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-147.ec2.internal,UID:ip-10-0-134-147.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-147.ec2.internal,},FirstTimestamp:2026-04-24 21:15:40.421989142 +0000 UTC m=+0.417739893,LastTimestamp:2026-04-24 21:15:40.421989142 +0000 UTC m=+0.417739893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-147.ec2.internal,}" Apr 24 21:15:40.431270 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.431195 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:15:40.431333 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.431271 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:15:40.431333 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.431289 2573 factory.go:55] Registering systemd factory Apr 24 21:15:40.431436 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.431352 2573 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:15:40.431436 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.431291 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:15:40.432628 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.432606 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:15:40.432628 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.432629 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:15:40.432722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.432670 2573 factory.go:153] Registering CRI-O factory Apr 24 21:15:40.432722 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.432685 2573 factory.go:223] Registration of the crio container factory successfully Apr 24 21:15:40.432776 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.432740 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:15:40.432776 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.432763 2573 factory.go:103] Registering Raw factory Apr 24 21:15:40.432828 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.432779 2573 manager.go:1196] Started watching for new ooms in manager Apr 24 21:15:40.432855 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.432826 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found" Apr 24 21:15:40.433331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.433316 2573 manager.go:319] Starting recovery of all containers Apr 24 21:15:40.435977 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.435948 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:15:40.436087 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.436063 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:15:40.436650 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.436610 2573 kubelet.go:1618] "Image garbage 
collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:15:40.439478 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.439453 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6scjm" Apr 24 21:15:40.442681 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.442524 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:15:40.444617 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.444585 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6scjm" Apr 24 21:15:40.444961 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.444817 2573 manager.go:324] Recovery completed Apr 24 21:15:40.449558 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.449545 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:40.451904 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.451890 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:40.451971 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.451916 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:40.451971 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.451926 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:40.452362 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.452350 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:15:40.452362 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.452360 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 
21:15:40.452457 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.452374 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:15:40.453992 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.453926 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-147.ec2.internal.18a96783084b592d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-147.ec2.internal,UID:ip-10-0-134-147.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-147.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-147.ec2.internal,},FirstTimestamp:2026-04-24 21:15:40.451903789 +0000 UTC m=+0.447654538,LastTimestamp:2026-04-24 21:15:40.451903789 +0000 UTC m=+0.447654538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-147.ec2.internal,}" Apr 24 21:15:40.454452 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.454438 2573 policy_none.go:49] "None policy: Start" Apr 24 21:15:40.454521 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.454456 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:15:40.454521 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.454469 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:15:40.492574 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.492559 2573 manager.go:341] "Starting Device Plugin manager" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.492594 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:15:40.504034 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.492604 2573 server.go:85] "Starting device plugin registration server" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.492858 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.492888 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.493092 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.493171 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.493180 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.493691 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:15:40.504034 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.493729 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-147.ec2.internal\" not found" Apr 24 21:15:40.566345 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.566326 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:15:40.566434 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.566354 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:15:40.566434 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.566379 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:15:40.566434 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.566385 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:15:40.566434 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.566415 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:15:40.568808 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.568787 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:15:40.593929 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.593889 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:40.594666 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.594652 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:40.594736 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.594691 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:40.594736 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.594702 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:40.594736 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.594722 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-147.ec2.internal" Apr 24 21:15:40.603974 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.603960 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-147.ec2.internal" Apr 24 21:15:40.604047 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.603979 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-147.ec2.internal\": node \"ip-10-0-134-147.ec2.internal\" not found" Apr 24 
21:15:40.621975 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.621961 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found" Apr 24 21:15:40.666532 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.666511 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"] Apr 24 21:15:40.666604 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.666565 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:40.667907 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.667891 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:40.667975 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.667916 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:40.667975 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.667929 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:40.669230 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669218 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:40.669358 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669343 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal" Apr 24 21:15:40.669402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669370 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:40.669884 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669859 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:40.669964 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669885 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:40.669964 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669900 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:40.669964 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669906 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:40.669964 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669916 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:40.670137 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.669917 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:40.671063 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.671047 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal" Apr 24 21:15:40.671143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.671077 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:15:40.671667 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.671653 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:15:40.671735 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.671681 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:15:40.671735 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.671694 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:15:40.692394 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.692375 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-147.ec2.internal\" not found" node="ip-10-0-134-147.ec2.internal" Apr 24 21:15:40.696312 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.696297 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-147.ec2.internal\" not found" node="ip-10-0-134-147.ec2.internal" Apr 24 21:15:40.722428 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.722408 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found" Apr 24 21:15:40.734449 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.734429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7195f4bce997b5527f5c21d5b6e5e49-config\") pod 
\"kube-apiserver-proxy-ip-10-0-134-147.ec2.internal\" (UID: \"c7195f4bce997b5527f5c21d5b6e5e49\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.734522 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.734451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e662b2546977ef7b16e2af25ff1dfef2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal\" (UID: \"e662b2546977ef7b16e2af25ff1dfef2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.734522 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.734466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e662b2546977ef7b16e2af25ff1dfef2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal\" (UID: \"e662b2546977ef7b16e2af25ff1dfef2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.822760 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.822732 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:40.835178 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.835158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e662b2546977ef7b16e2af25ff1dfef2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal\" (UID: \"e662b2546977ef7b16e2af25ff1dfef2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.835236 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.835182 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7195f4bce997b5527f5c21d5b6e5e49-config\") pod \"kube-apiserver-proxy-ip-10-0-134-147.ec2.internal\" (UID: \"c7195f4bce997b5527f5c21d5b6e5e49\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.835236 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.835197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e662b2546977ef7b16e2af25ff1dfef2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal\" (UID: \"e662b2546977ef7b16e2af25ff1dfef2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.835302 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.835236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e662b2546977ef7b16e2af25ff1dfef2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal\" (UID: \"e662b2546977ef7b16e2af25ff1dfef2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.835302 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.835248 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e662b2546977ef7b16e2af25ff1dfef2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal\" (UID: \"e662b2546977ef7b16e2af25ff1dfef2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.835302 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.835283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7195f4bce997b5527f5c21d5b6e5e49-config\") pod \"kube-apiserver-proxy-ip-10-0-134-147.ec2.internal\" (UID: \"c7195f4bce997b5527f5c21d5b6e5e49\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.923625 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:40.923561 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:40.994110 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.994087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:40.998267 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:40.998249 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:41.024134 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.024099 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.124599 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.124578 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.225104 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.225036 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.251371 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.251349 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:15:41.325986 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.325963 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.340445 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.340417 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:15:41.340558 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.340541 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:15:41.340599 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.340583 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:15:41.426150 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.426126 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.430569 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.430550 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:15:41.443677 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.443659 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:15:41.446513 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.446483 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:10:40 +0000 UTC" deadline="2027-09-26 21:55:13.274992715 +0000 UTC"
Apr 24 21:15:41.446513 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.446511 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12480h39m31.828484502s"
Apr 24 21:15:41.467639 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.467619 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dj8c4"
Apr 24 21:15:41.472948 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.472926 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dj8c4"
Apr 24 21:15:41.488277 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.488235 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:15:41.504461 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:41.504430 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7195f4bce997b5527f5c21d5b6e5e49.slice/crio-3c19ce26cd206c10100c3c44f36362325b7be9f93040da20dd1a204d9ff7bbc3 WatchSource:0}: Error finding container 3c19ce26cd206c10100c3c44f36362325b7be9f93040da20dd1a204d9ff7bbc3: Status 404 returned error can't find the container with id 3c19ce26cd206c10100c3c44f36362325b7be9f93040da20dd1a204d9ff7bbc3
Apr 24 21:15:41.504702 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:41.504689 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode662b2546977ef7b16e2af25ff1dfef2.slice/crio-8c483e0b3867d8149e317866f821d19f24887c17b0813a830c022257ae22c8b8 WatchSource:0}: Error finding container 8c483e0b3867d8149e317866f821d19f24887c17b0813a830c022257ae22c8b8: Status 404 returned error can't find the container with id 8c483e0b3867d8149e317866f821d19f24887c17b0813a830c022257ae22c8b8
Apr 24 21:15:41.508821 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.508808 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:15:41.526733 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.526710 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.568816 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.568772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal" event={"ID":"e662b2546977ef7b16e2af25ff1dfef2","Type":"ContainerStarted","Data":"8c483e0b3867d8149e317866f821d19f24887c17b0813a830c022257ae22c8b8"}
Apr 24 21:15:41.569700 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:41.569680 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal" event={"ID":"c7195f4bce997b5527f5c21d5b6e5e49","Type":"ContainerStarted","Data":"3c19ce26cd206c10100c3c44f36362325b7be9f93040da20dd1a204d9ff7bbc3"}
Apr 24 21:15:41.626797 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.626777 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.727310 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.727289 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.827786 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.827734 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:41.928502 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:41.928475 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-147.ec2.internal\" not found"
Apr 24 21:15:42.008102 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.008075 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:15:42.031222 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.031205 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:42.043596 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.043577 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:15:42.044529 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.044507 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal"
Apr 24 21:15:42.050366 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.050347 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:15:42.412555 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.412515 2573 apiserver.go:52] "Watching apiserver"
Apr 24 21:15:42.422699 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.422675 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:15:42.425238 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.425210 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-t26r7","openshift-dns/node-resolver-zscc7","openshift-image-registry/node-ca-wv544","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal","openshift-multus/multus-additional-cni-plugins-hz9sp","openshift-multus/network-metrics-daemon-m6d6n","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96","openshift-multus/multus-4qhzb","openshift-network-diagnostics/network-check-target-wtj7q","openshift-network-operator/iptables-alerter-bztfd","openshift-ovn-kubernetes/ovnkube-node-zvr9w","kube-system/konnectivity-agent-vcw5r","kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal"]
Apr 24 21:15:42.428921 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.428887 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bztfd"
Apr 24 21:15:42.430651 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.430319 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.431794 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.431770 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wv544"
Apr 24 21:15:42.433749 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.433340 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mct2p\""
Apr 24 21:15:42.433749 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.433533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:42.433749 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.433603 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:15:42.433749 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.433632 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:15:42.434015 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.433942 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:15:42.434015 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.433951 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:15:42.434329 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.434103 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:15:42.434329 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.434117 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:15:42.434329 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.434176 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6ppx7\""
Apr 24 21:15:42.435053 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.435034 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:15:42.435443 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.435297 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:15:42.435443 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.435339 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btscm\""
Apr 24 21:15:42.435648 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.435632 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.436348 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.436181 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:15:42.438438 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.438394 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:15:42.439133 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.439087 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:15:42.441164 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.439675 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.441164 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.439759 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:15:42.441164 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.439841 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:15:42.441164 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.440073 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:15:42.441164 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.440591 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cx7l9\""
Apr 24 21:15:42.443373 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.442557 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.443373 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.442582 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.443373 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/019f7af9-37e7-4923-a370-a980a06b7377-iptables-alerter-script\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd"
Apr 24 21:15:42.443576 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443433 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fp9\" (UniqueName: \"kubernetes.io/projected/019f7af9-37e7-4923-a370-a980a06b7377-kube-api-access-b8fp9\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd"
Apr 24 21:15:42.443576 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443490 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-hosts-file\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.443576 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-cnibin\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.443722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-cni-binary-copy\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.443722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443611 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:15:42.443722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhksq\" (UniqueName: \"kubernetes.io/projected/f855c2da-63b3-4393-85d5-d812d3b86100-kube-api-access-fhksq\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.443722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946mn\" (UniqueName: \"kubernetes.io/projected/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-kube-api-access-946mn\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.443722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.443716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.444410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444219 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bsw77\""
Apr 24 21:15:42.444495 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.444551 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-tmp-dir\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.444551 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f71747fd-1913-4d70-b833-4f352b05ba15-host\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544"
Apr 24 21:15:42.444655 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f71747fd-1913-4d70-b833-4f352b05ba15-serviceca\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544"
Apr 24 21:15:42.444655 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444606 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7b6l\" (UniqueName: \"kubernetes.io/projected/f71747fd-1913-4d70-b833-4f352b05ba15-kube-api-access-q7b6l\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544"
Apr 24 21:15:42.444655 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-system-cni-dir\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.444855 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444658 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.444855 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/019f7af9-37e7-4923-a370-a980a06b7377-host-slash\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd"
Apr 24 21:15:42.444855 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.444706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-os-release\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.445105 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.445078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.447137 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.447117 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:15:42.449008 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.448992 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:42.449096 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.449055 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:15:42.454644 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.454623 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pmvlf\""
Apr 24 21:15:42.454831 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.454815 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:15:42.459943 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.459725 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:15:42.459943 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.459744 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:15:42.459943 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.459919 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:15:42.460131 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.460000 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:15:42.460131 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.460101 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:15:42.461080 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.460985 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:15:42.461762 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.461742 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:15:42.466504 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.466248 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s5q26\""
Apr 24 21:15:42.466504 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.466262 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k64g9\""
Apr 24 21:15:42.466504 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.466296 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:15:42.466504 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.466336 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r4rp4\""
Apr 24 21:15:42.466504 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.466400 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:15:42.466504 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.466469 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:15:42.466818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.466588 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:15:42.467124 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.467108 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:15:42.474268 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.474240 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:10:41 +0000 UTC" deadline="2028-02-08 20:52:07.145702876 +0000 UTC"
Apr 24 21:15:42.474268 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.474270 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15719h36m24.671437016s"
Apr 24 21:15:42.532556 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.532534 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:15:42.545164 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545141 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-socket-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.545284 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-os-release\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.545284 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-var-lib-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.545284 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-cni-binary-copy\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.545435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545291 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-lib-modules\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.545435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-os-release\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.545435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-var-lib-kubelet\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.545435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/790d66ed-34eb-4ff4-b315-99c7cda83b63-konnectivity-ca\") pod \"konnectivity-agent-vcw5r\" (UID: \"790d66ed-34eb-4ff4-b315-99c7cda83b63\") " pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:15:42.545435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545399 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-kubernetes\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.545435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545424 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-sys\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.545611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:42.545611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-cni-binary-copy\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.545611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-netns\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.545611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-tmp-dir\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.545611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.545611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-systemd\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-cni-bin\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545673 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7swjs\" (UniqueName: \"kubernetes.io/projected/0a61c3b4-bf4f-42f7-afd7-075420c1040d-kube-api-access-7swjs\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-cni-multus\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/019f7af9-37e7-4923-a370-a980a06b7377-host-slash\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd" Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-cni-binary-copy\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545764 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-kubelet\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-etc-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.545818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-cni-netd\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.546213 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.545855 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovnkube-config\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.546213 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-tmp-dir\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7" Apr 24 21:15:42.546213 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-host\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.546213 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhksq\" (UniqueName: \"kubernetes.io/projected/f855c2da-63b3-4393-85d5-d812d3b86100-kube-api-access-fhksq\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.546394 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-cni-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.546394 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-k8s-cni-cncf-io\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.546394 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-sys-fs\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: 
\"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.546394 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546335 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-tuned\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.546394 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546357 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e385e54-2661-4cd9-8bc1-ad9750b2e402-tmp\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.546394 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546375 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.546394 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-env-overrides\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/019f7af9-37e7-4923-a370-a980a06b7377-host-slash\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv8n6\" (UniqueName: \"kubernetes.io/projected/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-kube-api-access-sv8n6\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f71747fd-1913-4d70-b833-4f352b05ba15-serviceca\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-system-cni-dir\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-log-socket\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546620 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/790d66ed-34eb-4ff4-b315-99c7cda83b63-agent-certs\") pod \"konnectivity-agent-vcw5r\" (UID: \"790d66ed-34eb-4ff4-b315-99c7cda83b63\") " pod="kube-system/konnectivity-agent-vcw5r" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546648 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-kubelet\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.546686 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-device-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t9zd\" (UniqueName: \"kubernetes.io/projected/6561db34-89c6-42c5-a92a-a20600534a7d-kube-api-access-8t9zd\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-system-cni-dir\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " 
pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-systemd\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546776 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-node-log\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546808 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqjj2\" (UniqueName: \"kubernetes.io/projected/223043ea-b132-4d5d-9a14-0496d53fdc53-kube-api-access-hqjj2\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-cni-bin\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysconfig\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.546990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.546955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysctl-d\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547001 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-run\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/019f7af9-37e7-4923-a370-a980a06b7377-iptables-alerter-script\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547081 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fp9\" (UniqueName: \"kubernetes.io/projected/019f7af9-37e7-4923-a370-a980a06b7377-kube-api-access-b8fp9\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-cnibin\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547123 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f71747fd-1913-4d70-b833-4f352b05ba15-serviceca\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-ovn\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-modprobe-d\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.547417 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:15:42.547210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-cnibin\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.547417 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547413 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946mn\" (UniqueName: \"kubernetes.io/projected/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-kube-api-access-946mn\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-slash\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547518 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zvr9w\" (UID: 
\"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-hostroot\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547606 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-daemon-config\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysctl-conf\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f71747fd-1913-4d70-b833-4f352b05ba15-host\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/019f7af9-37e7-4923-a370-a980a06b7377-iptables-alerter-script\") pod 
\"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-systemd-units\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547760 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-run-netns\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.547822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547785 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-run-ovn-kubernetes\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f71747fd-1913-4d70-b833-4f352b05ba15-host\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544" Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f855c2da-63b3-4393-85d5-d812d3b86100-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-system-cni-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-cnibin\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.547998 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-socket-dir-parent\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548089 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovn-node-metrics-cert\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-etc-kubernetes\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-hosts-file\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548195 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-os-release\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-conf-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548265 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-hosts-file\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.548303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548308 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-registration-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.548915 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.548915 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548369 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqjxg\" (UniqueName: \"kubernetes.io/projected/5e385e54-2661-4cd9-8bc1-ad9750b2e402-kube-api-access-hqjxg\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.548915 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7b6l\" (UniqueName: \"kubernetes.io/projected/f71747fd-1913-4d70-b833-4f352b05ba15-kube-api-access-q7b6l\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544"
Apr 24 21:15:42.548915 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovnkube-script-lib\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.548915 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548595 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:42.548915 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-multus-certs\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.548915 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.548864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f855c2da-63b3-4393-85d5-d812d3b86100-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.555309 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.555276 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:15:42.558649 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.558630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhksq\" (UniqueName: \"kubernetes.io/projected/f855c2da-63b3-4393-85d5-d812d3b86100-kube-api-access-fhksq\") pod \"multus-additional-cni-plugins-hz9sp\" (UID: \"f855c2da-63b3-4393-85d5-d812d3b86100\") " pod="openshift-multus/multus-additional-cni-plugins-hz9sp"
Apr 24 21:15:42.558738 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.558639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-946mn\" (UniqueName: \"kubernetes.io/projected/a1ae49ae-a1e3-464e-a9db-3d0bad2349ab-kube-api-access-946mn\") pod \"node-resolver-zscc7\" (UID: \"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab\") " pod="openshift-dns/node-resolver-zscc7"
Apr 24 21:15:42.558808 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.558769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7b6l\" (UniqueName: \"kubernetes.io/projected/f71747fd-1913-4d70-b833-4f352b05ba15-kube-api-access-q7b6l\") pod \"node-ca-wv544\" (UID: \"f71747fd-1913-4d70-b833-4f352b05ba15\") " pod="openshift-image-registry/node-ca-wv544"
Apr 24 21:15:42.559101 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.559075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fp9\" (UniqueName: \"kubernetes.io/projected/019f7af9-37e7-4923-a370-a980a06b7377-kube-api-access-b8fp9\") pod \"iptables-alerter-bztfd\" (UID: \"019f7af9-37e7-4923-a370-a980a06b7377\") " pod="openshift-network-operator/iptables-alerter-bztfd"
Apr 24 21:15:42.581886 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.581850 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:15:42.649819 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.649787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovnkube-script-lib\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.649819 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.649821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:42.650039 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.649836 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-multus-certs\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.650039 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.649851 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-socket-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.650039 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.649913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-multus-certs\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.650039 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.649946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-var-lib-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.650039 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.649966 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.649973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-lib-modules\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.650098 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:43.150064752 +0000 UTC m=+3.145815493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650115 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-var-lib-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650115 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-socket-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-var-lib-kubelet\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650167 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-lib-modules\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650183 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-var-lib-kubelet\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/790d66ed-34eb-4ff4-b315-99c7cda83b63-konnectivity-ca\") pod \"konnectivity-agent-vcw5r\" (UID: \"790d66ed-34eb-4ff4-b315-99c7cda83b63\") " pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-kubernetes\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-sys\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650275 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-sys\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-cni-binary-copy\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650399 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-kubernetes\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovnkube-script-lib\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-netns\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-systemd\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-systemd\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-netns\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.650618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-cni-bin\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/790d66ed-34eb-4ff4-b315-99c7cda83b63-konnectivity-ca\") pod \"konnectivity-agent-vcw5r\" (UID: \"790d66ed-34eb-4ff4-b315-99c7cda83b63\") " pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-cni-bin\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7swjs\" (UniqueName: \"kubernetes.io/projected/0a61c3b4-bf4f-42f7-afd7-075420c1040d-kube-api-access-7swjs\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-cni-multus\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-kubelet\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-cni-binary-copy\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650817 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-etc-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-cni-multus\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-cni-netd\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650889 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-kubelet\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovnkube-config\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-cni-netd\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-host\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-etc-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-cni-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.650991 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-k8s-cni-cncf-io\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-run-k8s-cni-cncf-io\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-host\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-sys-fs\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-cni-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-tuned\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e385e54-2661-4cd9-8bc1-ad9750b2e402-tmp\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-sys-fs\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651164 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-env-overrides\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv8n6\" (UniqueName: \"kubernetes.io/projected/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-kube-api-access-sv8n6\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-log-socket\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/790d66ed-34eb-4ff4-b315-99c7cda83b63-agent-certs\") pod \"konnectivity-agent-vcw5r\" (UID: \"790d66ed-34eb-4ff4-b315-99c7cda83b63\") " pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-kubelet\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-device-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t9zd\" (UniqueName: \"kubernetes.io/projected/6561db34-89c6-42c5-a92a-a20600534a7d-kube-api-access-8t9zd\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovnkube-config\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-systemd\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-node-log\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.651863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-kubelet\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqjj2\" (UniqueName: \"kubernetes.io/projected/223043ea-b132-4d5d-9a14-0496d53fdc53-kube-api-access-hqjj2\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-log-socket\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-cni-bin\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-device-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysconfig\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-node-log\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysctl-d\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-systemd\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-run\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-host-var-lib-cni-bin\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a61c3b4-bf4f-42f7-afd7-075420c1040d-env-overrides\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-ovn\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651612 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-modprobe-d\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-slash\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-run\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysctl-d\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7"
Apr 24 21:15:42.652713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651723 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651736 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-ovn\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-hostroot\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb"
Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-slash\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-daemon-config\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651811 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysctl-conf\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysconfig\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651823 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-modprobe-d\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-systemd-units\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-run-netns\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651866 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-systemd-units\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651929 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-run-ovn-kubernetes\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-hostroot\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651962 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-run-netns\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651970 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-system-cni-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.651995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-cnibin\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.653413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-host-run-ovn-kubernetes\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-socket-dir-parent\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652056 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-sysctl-conf\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovn-node-metrics-cert\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652086 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-socket-dir-parent\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-system-cni-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a61c3b4-bf4f-42f7-afd7-075420c1040d-run-openvswitch\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-etc-kubernetes\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652110 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-cnibin\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-os-release\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-etc-kubernetes\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-conf-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-registration-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-os-release\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqjxg\" (UniqueName: 
\"kubernetes.io/projected/5e385e54-2661-4cd9-8bc1-ad9750b2e402-kube-api-access-hqjxg\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-conf-dir\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6561db34-89c6-42c5-a92a-a20600534a7d-registration-dir\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.654098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.652315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-multus-daemon-config\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.654974 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.653923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e385e54-2661-4cd9-8bc1-ad9750b2e402-tmp\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.654974 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.654182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/5e385e54-2661-4cd9-8bc1-ad9750b2e402-etc-tuned\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.654974 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.654302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/790d66ed-34eb-4ff4-b315-99c7cda83b63-agent-certs\") pod \"konnectivity-agent-vcw5r\" (UID: \"790d66ed-34eb-4ff4-b315-99c7cda83b63\") " pod="kube-system/konnectivity-agent-vcw5r" Apr 24 21:15:42.654974 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.654582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a61c3b4-bf4f-42f7-afd7-075420c1040d-ovn-node-metrics-cert\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.656881 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.656847 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:15:42.656981 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.656884 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:15:42.656981 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.656895 2573 projected.go:194] Error preparing data for projected volume kube-api-access-s7mp6 for pod openshift-network-diagnostics/network-check-target-wtj7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:42.656981 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:42.656957 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6 podName:ff449891-1658-40ff-a0bd-e08978c661e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:43.156943818 +0000 UTC m=+3.152694557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s7mp6" (UniqueName: "kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6") pod "network-check-target-wtj7q" (UID: "ff449891-1658-40ff-a0bd-e08978c661e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:42.659953 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.659929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7swjs\" (UniqueName: \"kubernetes.io/projected/0a61c3b4-bf4f-42f7-afd7-075420c1040d-kube-api-access-7swjs\") pod \"ovnkube-node-zvr9w\" (UID: \"0a61c3b4-bf4f-42f7-afd7-075420c1040d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.660465 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.660422 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqjj2\" (UniqueName: \"kubernetes.io/projected/223043ea-b132-4d5d-9a14-0496d53fdc53-kube-api-access-hqjj2\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:15:42.660789 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.660774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv8n6\" (UniqueName: \"kubernetes.io/projected/bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2-kube-api-access-sv8n6\") pod \"multus-4qhzb\" (UID: \"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2\") " pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.661063 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:15:42.661041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t9zd\" (UniqueName: \"kubernetes.io/projected/6561db34-89c6-42c5-a92a-a20600534a7d-kube-api-access-8t9zd\") pod \"aws-ebs-csi-driver-node-gqz96\" (UID: \"6561db34-89c6-42c5-a92a-a20600534a7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.661381 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.661363 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqjxg\" (UniqueName: \"kubernetes.io/projected/5e385e54-2661-4cd9-8bc1-ad9750b2e402-kube-api-access-hqjxg\") pod \"tuned-t26r7\" (UID: \"5e385e54-2661-4cd9-8bc1-ad9750b2e402\") " pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.744105 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.744037 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bztfd" Apr 24 21:15:42.751740 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.751716 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zscc7" Apr 24 21:15:42.761468 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.761452 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" Apr 24 21:15:42.768013 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.767997 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wv544" Apr 24 21:15:42.775502 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.775485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" Apr 24 21:15:42.782075 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.782057 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4qhzb" Apr 24 21:15:42.789644 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.789616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" Apr 24 21:15:42.797219 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.797201 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t26r7" Apr 24 21:15:42.800659 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:42.800643 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vcw5r" Apr 24 21:15:43.156758 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.156683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:15:43.156924 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:43.156853 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:43.156998 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:43.156930 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:44.156912388 +0000 UTC m=+4.152663124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:43.225447 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.225411 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4f30e8_cb9e_4140_96c8_b6e37a3be1d2.slice/crio-0ebe5c502a7cb9110c398f51e93ed9c3c32d653c8a6917b5c630178f70cca62e WatchSource:0}: Error finding container 0ebe5c502a7cb9110c398f51e93ed9c3c32d653c8a6917b5c630178f70cca62e: Status 404 returned error can't find the container with id 0ebe5c502a7cb9110c398f51e93ed9c3c32d653c8a6917b5c630178f70cca62e Apr 24 21:15:43.228517 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.228497 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf855c2da_63b3_4393_85d5_d812d3b86100.slice/crio-ad738f9a4aafe24024f18a27d2e4174b2f828c8800d3a91e09b8f9635069eefc WatchSource:0}: Error finding container ad738f9a4aafe24024f18a27d2e4174b2f828c8800d3a91e09b8f9635069eefc: Status 404 returned error can't find the container with id ad738f9a4aafe24024f18a27d2e4174b2f828c8800d3a91e09b8f9635069eefc Apr 24 21:15:43.229827 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.229783 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod019f7af9_37e7_4923_a370_a980a06b7377.slice/crio-0a14c817a897af2bdc1edd64ddcfa4aba43636e96e4cadbf5ba8e46709989fe6 WatchSource:0}: Error finding container 0a14c817a897af2bdc1edd64ddcfa4aba43636e96e4cadbf5ba8e46709989fe6: Status 404 returned error can't find the container with id 0a14c817a897af2bdc1edd64ddcfa4aba43636e96e4cadbf5ba8e46709989fe6 Apr 24 21:15:43.230944 
ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.230569 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790d66ed_34eb_4ff4_b315_99c7cda83b63.slice/crio-e33da2fb0da84b5c8f7c7f6eb88fc1ae78dc9aa5c9ceba0f4e333ad2dcbe2e63 WatchSource:0}: Error finding container e33da2fb0da84b5c8f7c7f6eb88fc1ae78dc9aa5c9ceba0f4e333ad2dcbe2e63: Status 404 returned error can't find the container with id e33da2fb0da84b5c8f7c7f6eb88fc1ae78dc9aa5c9ceba0f4e333ad2dcbe2e63 Apr 24 21:15:43.231273 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.231236 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf71747fd_1913_4d70_b833_4f352b05ba15.slice/crio-4cd5c2e90c5b8f519b8abd22279bb863a44f7b333c3f53dc5f677b5349badda2 WatchSource:0}: Error finding container 4cd5c2e90c5b8f519b8abd22279bb863a44f7b333c3f53dc5f677b5349badda2: Status 404 returned error can't find the container with id 4cd5c2e90c5b8f519b8abd22279bb863a44f7b333c3f53dc5f677b5349badda2 Apr 24 21:15:43.251740 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.251713 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6561db34_89c6_42c5_a92a_a20600534a7d.slice/crio-9e734b1849bcf348bf2d047cd9a5857dda810b1ad4e0440812c8b3dee4283293 WatchSource:0}: Error finding container 9e734b1849bcf348bf2d047cd9a5857dda810b1ad4e0440812c8b3dee4283293: Status 404 returned error can't find the container with id 9e734b1849bcf348bf2d047cd9a5857dda810b1ad4e0440812c8b3dee4283293 Apr 24 21:15:43.252670 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.252648 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1ae49ae_a1e3_464e_a9db_3d0bad2349ab.slice/crio-5517e67139a3242fc7a2ddd29b2d159673b445e0bc374c10aca8151864b599a5 WatchSource:0}: 
Error finding container 5517e67139a3242fc7a2ddd29b2d159673b445e0bc374c10aca8151864b599a5: Status 404 returned error can't find the container with id 5517e67139a3242fc7a2ddd29b2d159673b445e0bc374c10aca8151864b599a5
Apr 24 21:15:43.253665 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:15:43.253642    2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a61c3b4_bf4f_42f7_afd7_075420c1040d.slice/crio-b389175b78f6dacc404af0640328c48990b49a7a15605845b8d60b9755466f16 WatchSource:0}: Error finding container b389175b78f6dacc404af0640328c48990b49a7a15605845b8d60b9755466f16: Status 404 returned error can't find the container with id b389175b78f6dacc404af0640328c48990b49a7a15605845b8d60b9755466f16
Apr 24 21:15:43.257335 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.257305    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:43.257476 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:43.257459    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:15:43.257523 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:43.257476    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:15:43.257523 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:43.257495    2573 projected.go:194] Error preparing data for projected volume kube-api-access-s7mp6 for pod openshift-network-diagnostics/network-check-target-wtj7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:43.257613 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:43.257561    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6 podName:ff449891-1658-40ff-a0bd-e08978c661e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:44.257539622 +0000 UTC m=+4.253290374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7mp6" (UniqueName: "kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6") pod "network-check-target-wtj7q" (UID: "ff449891-1658-40ff-a0bd-e08978c661e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:43.474896 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.474667    2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:10:41 +0000 UTC" deadline="2027-10-12 14:27:25.525626572 +0000 UTC"
Apr 24 21:15:43.474896 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.474829    2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12857h11m42.050801212s"
Apr 24 21:15:43.567362 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.567336    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:43.567481 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:43.567441    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:15:43.573724 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.573693    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4qhzb" event={"ID":"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2","Type":"ContainerStarted","Data":"0ebe5c502a7cb9110c398f51e93ed9c3c32d653c8a6917b5c630178f70cca62e"}
Apr 24 21:15:43.574803 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.574765    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t26r7" event={"ID":"5e385e54-2661-4cd9-8bc1-ad9750b2e402","Type":"ContainerStarted","Data":"187220f999e05cddd7ff7aaac898b0fff5b159b56974762c1d964bcfa896e2f8"}
Apr 24 21:15:43.575858 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.575799    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"b389175b78f6dacc404af0640328c48990b49a7a15605845b8d60b9755466f16"}
Apr 24 21:15:43.577168 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.577146    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" event={"ID":"6561db34-89c6-42c5-a92a-a20600534a7d","Type":"ContainerStarted","Data":"9e734b1849bcf348bf2d047cd9a5857dda810b1ad4e0440812c8b3dee4283293"}
Apr 24 21:15:43.578211 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.578178    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vcw5r" event={"ID":"790d66ed-34eb-4ff4-b315-99c7cda83b63","Type":"ContainerStarted","Data":"e33da2fb0da84b5c8f7c7f6eb88fc1ae78dc9aa5c9ceba0f4e333ad2dcbe2e63"}
Apr 24 21:15:43.579188 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.579166    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bztfd" event={"ID":"019f7af9-37e7-4923-a370-a980a06b7377","Type":"ContainerStarted","Data":"0a14c817a897af2bdc1edd64ddcfa4aba43636e96e4cadbf5ba8e46709989fe6"}
Apr 24 21:15:43.580271 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.580246    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerStarted","Data":"ad738f9a4aafe24024f18a27d2e4174b2f828c8800d3a91e09b8f9635069eefc"}
Apr 24 21:15:43.582140 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.582117    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal" event={"ID":"c7195f4bce997b5527f5c21d5b6e5e49","Type":"ContainerStarted","Data":"b821854d1bb02d58b6293e541f3ea29cf530c7d333190683cb79c206c95ffb96"}
Apr 24 21:15:43.583389 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.583355    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zscc7" event={"ID":"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab","Type":"ContainerStarted","Data":"5517e67139a3242fc7a2ddd29b2d159673b445e0bc374c10aca8151864b599a5"}
Apr 24 21:15:43.584439 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:43.584416    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wv544" event={"ID":"f71747fd-1913-4d70-b833-4f352b05ba15","Type":"ContainerStarted","Data":"4cd5c2e90c5b8f519b8abd22279bb863a44f7b333c3f53dc5f677b5349badda2"}
Apr 24 21:15:44.164370 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:44.164336    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:44.164530 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:44.164511    2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:44.164587 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:44.164579    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:46.164561451 +0000 UTC m=+6.160312194 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:44.265519 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:44.264903    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:44.265519 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:44.265066    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:15:44.265519 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:44.265086    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:15:44.265519 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:44.265098    2573 projected.go:194] Error preparing data for projected volume kube-api-access-s7mp6 for pod openshift-network-diagnostics/network-check-target-wtj7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:44.265519 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:44.265159    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6 podName:ff449891-1658-40ff-a0bd-e08978c661e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:46.26514151 +0000 UTC m=+6.260892253 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7mp6" (UniqueName: "kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6") pod "network-check-target-wtj7q" (UID: "ff449891-1658-40ff-a0bd-e08978c661e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:44.290751 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:44.290696    2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:15:44.568101 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:44.568020    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:44.568525 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:44.568139    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:15:44.627072 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:44.627038    2573 generic.go:358] "Generic (PLEG): container finished" podID="e662b2546977ef7b16e2af25ff1dfef2" containerID="b04e8c059001b7bc69ad433b19229854620592006d45d60088b2b17e694b2aa0" exitCode=0
Apr 24 21:15:44.627229 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:44.627143    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal" event={"ID":"e662b2546977ef7b16e2af25ff1dfef2","Type":"ContainerDied","Data":"b04e8c059001b7bc69ad433b19229854620592006d45d60088b2b17e694b2aa0"}
Apr 24 21:15:44.646262 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:44.646213    2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-147.ec2.internal" podStartSLOduration=2.646195391 podStartE2EDuration="2.646195391s" podCreationTimestamp="2026-04-24 21:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:15:43.59457117 +0000 UTC m=+3.590321927" watchObservedRunningTime="2026-04-24 21:15:44.646195391 +0000 UTC m=+4.641946151"
Apr 24 21:15:45.567171 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:45.567136    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:45.567338 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:45.567282    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:15:45.648496 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:45.648369    2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal" event={"ID":"e662b2546977ef7b16e2af25ff1dfef2","Type":"ContainerStarted","Data":"59970ef0b7335915b9081200c10717edbebb48aaf82ccb7dedcf5c3cab9272d4"}
Apr 24 21:15:46.187255 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:46.186565    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:46.187255 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:46.186760    2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:46.187255 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:46.186820    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:50.186803889 +0000 UTC m=+10.182554624 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:46.287778 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:46.287740    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:46.287998 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:46.287978    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:15:46.288078 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:46.288004    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:15:46.288078 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:46.288017    2573 projected.go:194] Error preparing data for projected volume kube-api-access-s7mp6 for pod openshift-network-diagnostics/network-check-target-wtj7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:46.288194 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:46.288086    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6 podName:ff449891-1658-40ff-a0bd-e08978c661e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:50.288066011 +0000 UTC m=+10.283816754 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7mp6" (UniqueName: "kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6") pod "network-check-target-wtj7q" (UID: "ff449891-1658-40ff-a0bd-e08978c661e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:46.567338 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:46.567308    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:46.567513 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:46.567433    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:15:47.177043 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.176054    2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-147.ec2.internal" podStartSLOduration=5.176023983 podStartE2EDuration="5.176023983s" podCreationTimestamp="2026-04-24 21:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:15:45.674070176 +0000 UTC m=+5.669820940" watchObservedRunningTime="2026-04-24 21:15:47.176023983 +0000 UTC m=+7.171774742"
Apr 24 21:15:47.177043 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.176297    2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mt97w"]
Apr 24 21:15:47.184550 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.184215    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.184550 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:47.184294    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:15:47.295111 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.295073    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d143aeb1-2388-4e2b-94e5-feca18fa8e79-kubelet-config\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.295291 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.295133    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d143aeb1-2388-4e2b-94e5-feca18fa8e79-dbus\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.295291 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.295180    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.395975 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.395939    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d143aeb1-2388-4e2b-94e5-feca18fa8e79-kubelet-config\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.396127 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.395991    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d143aeb1-2388-4e2b-94e5-feca18fa8e79-dbus\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.396127 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.396027    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.396238 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:47.396142    2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:47.396238 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:47.396202    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret podName:d143aeb1-2388-4e2b-94e5-feca18fa8e79 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:47.896184278 +0000 UTC m=+7.891935031 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret") pod "global-pull-secret-syncer-mt97w" (UID: "d143aeb1-2388-4e2b-94e5-feca18fa8e79") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:47.396238 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.396211    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d143aeb1-2388-4e2b-94e5-feca18fa8e79-kubelet-config\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.396375 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.396302    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d143aeb1-2388-4e2b-94e5-feca18fa8e79-dbus\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.567209 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.567174    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:47.567346 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:47.567318    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:15:47.900149 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:47.900049    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:47.900316 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:47.900193    2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:47.900316 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:47.900257    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret podName:d143aeb1-2388-4e2b-94e5-feca18fa8e79 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:48.900237233 +0000 UTC m=+8.895987975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret") pod "global-pull-secret-syncer-mt97w" (UID: "d143aeb1-2388-4e2b-94e5-feca18fa8e79") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:48.567384 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:48.566772    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:48.567384 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:48.566912    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:15:48.567384 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:48.567252    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:48.567384 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:48.567346    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:15:48.908176 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:48.908103    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:48.908334 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:48.908277    2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:48.908394 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:48.908348    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret podName:d143aeb1-2388-4e2b-94e5-feca18fa8e79 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:50.90832782 +0000 UTC m=+10.904078562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret") pod "global-pull-secret-syncer-mt97w" (UID: "d143aeb1-2388-4e2b-94e5-feca18fa8e79") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:49.567209 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:49.567177    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:49.567399 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:49.567325    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:15:50.219174 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:50.218775    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:50.219174 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.218991    2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:50.219174 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.219057    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:58.219037053 +0000 UTC m=+18.214787806 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:15:50.320027 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:50.319624    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:50.320027 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.319810    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:15:50.320027 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.319832    2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:15:50.320027 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.319845    2573 projected.go:194] Error preparing data for projected volume kube-api-access-s7mp6 for pod openshift-network-diagnostics/network-check-target-wtj7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:50.320027 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.319921    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6 podName:ff449891-1658-40ff-a0bd-e08978c661e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:58.319901784 +0000 UTC m=+18.315652521 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7mp6" (UniqueName: "kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6") pod "network-check-target-wtj7q" (UID: "ff449891-1658-40ff-a0bd-e08978c661e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:15:50.568143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:50.568070    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:50.568286 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.568189    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:15:50.568286 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:50.568212    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:50.568390 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.568307    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:15:50.924722 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:50.924631    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:50.924941 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.924780    2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:50.924941 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:50.924842    2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret podName:d143aeb1-2388-4e2b-94e5-feca18fa8e79 nodeName:}" failed. No retries permitted until 2026-04-24 21:15:54.92482418 +0000 UTC m=+14.920574918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret") pod "global-pull-secret-syncer-mt97w" (UID: "d143aeb1-2388-4e2b-94e5-feca18fa8e79") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:15:51.567216 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:51.567183    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:51.567744 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:51.567313    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:15:52.566692 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:52.566661    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:52.566896 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:52.566702    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:52.566896 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:52.566789    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:15:52.567086 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:52.566935    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:15:53.566691 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:53.566658    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:15:53.567165 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:53.566784    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:15:54.566772 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:54.566744    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:15:54.567165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:54.566753    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:15:54.567165 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:54.566884    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:15:54.567165 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:54.566951    2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9" Apr 24 21:15:54.956115 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:54.956081 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:15:54.956264 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:54.956240 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:15:54.956324 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:54.956311 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret podName:d143aeb1-2388-4e2b-94e5-feca18fa8e79 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:02.956291883 +0000 UTC m=+22.952042631 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret") pod "global-pull-secret-syncer-mt97w" (UID: "d143aeb1-2388-4e2b-94e5-feca18fa8e79") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:15:55.567365 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:55.567334 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:15:55.567707 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:55.567464 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53" Apr 24 21:15:56.567037 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:56.566994 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:15:56.567222 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:56.567107 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9" Apr 24 21:15:56.567222 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:56.567191 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:15:56.567327 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:56.567287 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79" Apr 24 21:15:57.566943 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:57.566905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:15:57.567492 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:57.567039 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53" Apr 24 21:15:58.280384 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:58.280346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:15:58.280561 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.280538 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:58.280643 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.280630 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:14.280607437 +0000 UTC m=+34.276358193 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:15:58.381690 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:58.381657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:15:58.381845 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.381827 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:15:58.381911 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.381854 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:15:58.381911 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.381865 2573 projected.go:194] Error preparing data for projected volume kube-api-access-s7mp6 for pod openshift-network-diagnostics/network-check-target-wtj7q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:58.382003 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.381932 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6 podName:ff449891-1658-40ff-a0bd-e08978c661e9 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:16:14.3819176 +0000 UTC m=+34.377668342 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7mp6" (UniqueName: "kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6") pod "network-check-target-wtj7q" (UID: "ff449891-1658-40ff-a0bd-e08978c661e9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:15:58.570594 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:58.570518 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:15:58.571060 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:58.570518 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:15:58.571060 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.570651 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9" Apr 24 21:15:58.571060 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:58.570740 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79" Apr 24 21:15:59.566754 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:15:59.566726 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:15:59.566929 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:15:59.566830 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53" Apr 24 21:16:00.568324 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.568069 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:00.568926 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.568126 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:00.568926 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:00.568433 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9" Apr 24 21:16:00.568926 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:00.568501 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79" Apr 24 21:16:00.673000 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.672976 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zscc7" event={"ID":"a1ae49ae-a1e3-464e-a9db-3d0bad2349ab","Type":"ContainerStarted","Data":"b3ffe74b603e216f5d8529d70e963bc3848db1bd9be71ebcff2079acf419ddc1"} Apr 24 21:16:00.674172 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.674149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wv544" event={"ID":"f71747fd-1913-4d70-b833-4f352b05ba15","Type":"ContainerStarted","Data":"c546eb57c197dddaeb6a2cb9dc9564cbfd27382a1c7ed3558311556d7de9ae70"} Apr 24 21:16:00.675208 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.675191 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4qhzb" event={"ID":"bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2","Type":"ContainerStarted","Data":"eefe0b8506e4ed7c3656ed6e241e5d44b64bb373f6cb99dd0bec5cd808430784"} Apr 24 21:16:00.676300 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.676276 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t26r7" event={"ID":"5e385e54-2661-4cd9-8bc1-ad9750b2e402","Type":"ContainerStarted","Data":"f8c51736606ec42162f917c9b8fe7daea2492ed09ec5288f1cc0de8654f934df"} Apr 24 21:16:00.677629 ip-10-0-134-147 kubenswrapper[2573]: I0424 
21:16:00.677609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"798a6cd071befb2e5f36616ed0e92cfe9fbc434afa3cb4c5e338a51f02219f81"} Apr 24 21:16:00.677729 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.677634 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"fce719d1527258e75eb149f6678d4cac1c0d79f22aed5c7c9dcb7deb3aacaf74"} Apr 24 21:16:00.678711 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.678692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" event={"ID":"6561db34-89c6-42c5-a92a-a20600534a7d","Type":"ContainerStarted","Data":"d99ee792805e377fd11751a8d9556d68d068d2ac14646a0206a915e4409d14ee"} Apr 24 21:16:00.679795 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.679779 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vcw5r" event={"ID":"790d66ed-34eb-4ff4-b315-99c7cda83b63","Type":"ContainerStarted","Data":"786a6a660aaa5ea70946a944f51ce10ace43a5e11ff4b1ea51e05cff7507cb10"} Apr 24 21:16:00.680980 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.680962 2573 generic.go:358] "Generic (PLEG): container finished" podID="f855c2da-63b3-4393-85d5-d812d3b86100" containerID="a910adb2eac7cf9ce2d68caaf51b822f36ea7eae0a6ddd5132016bf9f1e2d05c" exitCode=0 Apr 24 21:16:00.681038 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.680991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerDied","Data":"a910adb2eac7cf9ce2d68caaf51b822f36ea7eae0a6ddd5132016bf9f1e2d05c"} Apr 24 21:16:00.728443 ip-10-0-134-147 kubenswrapper[2573]: I0424 
21:16:00.727923 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zscc7" podStartSLOduration=4.008695725 podStartE2EDuration="20.72790388s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.255427029 +0000 UTC m=+3.251177765" lastFinishedPulling="2026-04-24 21:15:59.974635179 +0000 UTC m=+19.970385920" observedRunningTime="2026-04-24 21:16:00.727167639 +0000 UTC m=+20.722918409" watchObservedRunningTime="2026-04-24 21:16:00.72790388 +0000 UTC m=+20.723654660" Apr 24 21:16:00.751758 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.751588 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t26r7" podStartSLOduration=4.036842991 podStartE2EDuration="20.751570551s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.26158409 +0000 UTC m=+3.257334827" lastFinishedPulling="2026-04-24 21:15:59.976311636 +0000 UTC m=+19.972062387" observedRunningTime="2026-04-24 21:16:00.75071838 +0000 UTC m=+20.746469337" watchObservedRunningTime="2026-04-24 21:16:00.751570551 +0000 UTC m=+20.747321311" Apr 24 21:16:00.766609 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.766559 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wv544" podStartSLOduration=4.042744068 podStartE2EDuration="20.766547241s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.250790755 +0000 UTC m=+3.246541496" lastFinishedPulling="2026-04-24 21:15:59.974593924 +0000 UTC m=+19.970344669" observedRunningTime="2026-04-24 21:16:00.766023387 +0000 UTC m=+20.761774143" watchObservedRunningTime="2026-04-24 21:16:00.766547241 +0000 UTC m=+20.762297998" Apr 24 21:16:00.796188 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:00.796147 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-4qhzb" podStartSLOduration=4.015182759 podStartE2EDuration="20.796135037s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.227339056 +0000 UTC m=+3.223089792" lastFinishedPulling="2026-04-24 21:16:00.008291322 +0000 UTC m=+20.004042070" observedRunningTime="2026-04-24 21:16:00.795911082 +0000 UTC m=+20.791661842" watchObservedRunningTime="2026-04-24 21:16:00.796135037 +0000 UTC m=+20.791885794" Apr 24 21:16:01.566852 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.566829 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:16:01.566984 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:01.566962 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53" Apr 24 21:16:01.643941 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.643920 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:16:01.686166 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.686140 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:16:01.686471 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.686445 2573 generic.go:358] "Generic (PLEG): container finished" podID="0a61c3b4-bf4f-42f7-afd7-075420c1040d" containerID="798a6cd071befb2e5f36616ed0e92cfe9fbc434afa3cb4c5e338a51f02219f81" exitCode=1 Apr 24 21:16:01.686571 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.686504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerDied","Data":"798a6cd071befb2e5f36616ed0e92cfe9fbc434afa3cb4c5e338a51f02219f81"} Apr 24 21:16:01.686571 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.686533 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"f7e3356e402bd3b18b06fd6dc3909f495115895905655e1ce33721e33be6d7bd"} Apr 24 21:16:01.686571 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.686545 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"a3a63906c930da333feab91901e91c7a27695efc19f99318c5833657350cfa35"} Apr 24 21:16:01.686571 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.686554 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"898d1dbf6f1f30be28fc4940f3634adc76cae0d51eac77db0c62f8743abcc313"} Apr 24 21:16:01.686571 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.686562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"64b916670090cffa8adb617d7a6b804d5e6e0c999b53942c96bae8d70ed93ae8"} Apr 24 21:16:01.688194 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.688170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" event={"ID":"6561db34-89c6-42c5-a92a-a20600534a7d","Type":"ContainerStarted","Data":"db0695ccf85c771daad0e2df4c2f5fdf77a0e99a24fdcbf2f5f8b49830ed6dbf"} Apr 24 21:16:01.689690 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.689622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bztfd" event={"ID":"019f7af9-37e7-4923-a370-a980a06b7377","Type":"ContainerStarted","Data":"5630e380febc4f32605a96911b9752128110b743029659f2dbb4a78ce0c1e9e3"} Apr 24 21:16:01.705528 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.705483 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bztfd" podStartSLOduration=4.981623506 podStartE2EDuration="21.705466344s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.250674997 +0000 UTC m=+3.246425737" lastFinishedPulling="2026-04-24 21:15:59.974517825 +0000 UTC m=+19.970268575" observedRunningTime="2026-04-24 21:16:01.704697713 +0000 UTC m=+21.700448472" watchObservedRunningTime="2026-04-24 21:16:01.705466344 +0000 UTC m=+21.701217099" Apr 24 21:16:01.705882 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:01.705838 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vcw5r" podStartSLOduration=4.981903728 podStartE2EDuration="21.705827973s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.250661023 +0000 UTC m=+3.246411773" lastFinishedPulling="2026-04-24 21:15:59.974585282 +0000 UTC m=+19.970336018" observedRunningTime="2026-04-24 21:16:00.818296024 +0000 UTC m=+20.814046793" watchObservedRunningTime="2026-04-24 21:16:01.705827973 +0000 UTC m=+21.701578733" Apr 24 21:16:02.506756 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:02.506629 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:16:01.643938209Z","UUID":"df17e181-a736-4d6e-8637-1f2434a0311d","Handler":null,"Name":"","Endpoint":""} Apr 24 21:16:02.509312 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:02.509081 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:16:02.509312 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:02.509117 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:16:02.567669 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:02.567641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:02.567838 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:02.567756 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9" Apr 24 21:16:02.567918 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:02.567845 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:02.567990 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:02.567962 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79" Apr 24 21:16:03.016320 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:03.016283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:03.016683 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:03.016455 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:03.016683 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:03.016524 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret podName:d143aeb1-2388-4e2b-94e5-feca18fa8e79 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:19.016505727 +0000 UTC m=+39.012256477 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret") pod "global-pull-secret-syncer-mt97w" (UID: "d143aeb1-2388-4e2b-94e5-feca18fa8e79") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:16:03.566772 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:03.566595 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:16:03.566954 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:03.566843 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53" Apr 24 21:16:03.695435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:03.695375 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:16:03.695743 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:03.695720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"b034278c96e5a684271db9aa446352dd2a1d7c09e75a544b0a5e650bbb479328"} Apr 24 21:16:03.697262 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:03.697240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" event={"ID":"6561db34-89c6-42c5-a92a-a20600534a7d","Type":"ContainerStarted","Data":"00405afeb391b8a839a6252a358d329fe0ef5b0bc4bcf465cb9df240c15fbc74"} Apr 24 21:16:03.714358 ip-10-0-134-147 kubenswrapper[2573]: 
I0424 21:16:03.714319 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gqz96" podStartSLOduration=4.185752364 podStartE2EDuration="23.714309431s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.253617523 +0000 UTC m=+3.249368272" lastFinishedPulling="2026-04-24 21:16:02.782174599 +0000 UTC m=+22.777925339" observedRunningTime="2026-04-24 21:16:03.714106556 +0000 UTC m=+23.709857314" watchObservedRunningTime="2026-04-24 21:16:03.714309431 +0000 UTC m=+23.710060213" Apr 24 21:16:04.567297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:04.567208 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:04.567817 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:04.567322 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79" Apr 24 21:16:04.567817 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:04.567381 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:04.567817 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:04.567488 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:16:04.866093 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:04.866021 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:16:04.866679 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:04.866651 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:16:05.566541 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.566518 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:16:05.566632 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:05.566620 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:16:05.703849 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.703684 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log"
Apr 24 21:16:05.704556 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.704200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"e1a4c84cff5e8fe3fcd0f83f943c8f613147e04bb9cb6e2924fd19ba6192e362"}
Apr 24 21:16:05.704616 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.704561 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:16:05.704616 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.704591 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:16:05.704810 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.704785 2573 scope.go:117] "RemoveContainer" containerID="798a6cd071befb2e5f36616ed0e92cfe9fbc434afa3cb4c5e338a51f02219f81"
Apr 24 21:16:05.705991 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.705854 2573 generic.go:358] "Generic (PLEG): container finished" podID="f855c2da-63b3-4393-85d5-d812d3b86100" containerID="274b9956b305073508ed2746d166b3a9cfd89c57df8f216d4d9fba5a3e02837a" exitCode=0
Apr 24 21:16:05.705991 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.705900 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerDied","Data":"274b9956b305073508ed2746d166b3a9cfd89c57df8f216d4d9fba5a3e02837a"}
Apr 24 21:16:05.706367 ip-10-0-134-147 kubenswrapper[2573]: I0424
21:16:05.706273 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:16:05.706850 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.706815 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vcw5r"
Apr 24 21:16:05.720479 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:05.720452 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:16:06.566899 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.566863 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:16:06.566990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.566943 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:16:06.567073 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:06.567053 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:16:06.567162 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:06.567144 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:16:06.711022 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.710998 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log"
Apr 24 21:16:06.711411 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.711321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" event={"ID":"0a61c3b4-bf4f-42f7-afd7-075420c1040d","Type":"ContainerStarted","Data":"9ba7cf295512979fdf112d11e9af2696b321e6ec9d73092f96434fdd2344145c"}
Apr 24 21:16:06.711594 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.711574 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:16:06.713659 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.713637 2573 generic.go:358] "Generic (PLEG): container finished" podID="f855c2da-63b3-4393-85d5-d812d3b86100" containerID="c4d797be1fa0fb65c0ab391aacbb54857ccadd7d6e12c7dbc5275583c6115325" exitCode=0
Apr 24 21:16:06.713863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.713816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerDied","Data":"c4d797be1fa0fb65c0ab391aacbb54857ccadd7d6e12c7dbc5275583c6115325"}
Apr 24 21:16:06.728364 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.728335 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:16:06.771778 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:06.771715 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w" podStartSLOduration=9.665362837 podStartE2EDuration="26.771695267s"
podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.261476749 +0000 UTC m=+3.257227485" lastFinishedPulling="2026-04-24 21:16:00.367809163 +0000 UTC m=+20.363559915" observedRunningTime="2026-04-24 21:16:06.747703497 +0000 UTC m=+26.743454271" watchObservedRunningTime="2026-04-24 21:16:06.771695267 +0000 UTC m=+26.767446031"
Apr 24 21:16:07.490110 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.490086 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wtj7q"]
Apr 24 21:16:07.490208 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.490199 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:16:07.490325 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:07.490302 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:16:07.492805 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.492781 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mt97w"]
Apr 24 21:16:07.492919 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.492896 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:16:07.493018 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:07.492996 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:16:07.493376 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.493351 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m6d6n"]
Apr 24 21:16:07.493486 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.493440 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:16:07.493544 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:07.493511 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:16:07.718072 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.718043 2573 generic.go:358] "Generic (PLEG): container finished" podID="f855c2da-63b3-4393-85d5-d812d3b86100" containerID="821f16c7240d6779287c0549a3b1fdef028a441f77338d9c93515be26fcb6368" exitCode=0
Apr 24 21:16:07.718556 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:07.718138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerDied","Data":"821f16c7240d6779287c0549a3b1fdef028a441f77338d9c93515be26fcb6368"}
Apr 24 21:16:09.567032 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:09.567002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:16:09.567454 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:09.567002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:16:09.567454 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:09.567113 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:16:09.567454 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:09.567006 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:16:09.567454 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:09.567201 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:16:09.567454 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:09.567306 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:16:11.567049 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:11.566790 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:16:11.567456 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:11.566823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w"
Apr 24 21:16:11.567456 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:11.567087 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-wtj7q" podUID="ff449891-1658-40ff-a0bd-e08978c661e9"
Apr 24 21:16:11.567456 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:11.566853 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:16:11.567456 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:11.567193 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mt97w" podUID="d143aeb1-2388-4e2b-94e5-feca18fa8e79"
Apr 24 21:16:11.567456 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:11.567268 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6d6n" podUID="223043ea-b132-4d5d-9a14-0496d53fdc53"
Apr 24 21:16:12.807716 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.807690 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-147.ec2.internal" event="NodeReady"
Apr 24 21:16:12.808237 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.807830 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:16:12.840372 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.840345 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7675dc97f4-6p6nx"]
Apr 24 21:16:12.844959 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.844941 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:12.847396 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.847372 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:16:12.847509 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.847442 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bc8qn\""
Apr 24 21:16:12.847509 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.847375 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:16:12.847618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.847478 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:16:12.852973 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.852954 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:16:12.858354 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.858160 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4fr27"]
Apr 24 21:16:12.862404 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.862349 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7675dc97f4-6p6nx"]
Apr 24 21:16:12.862503 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.862475 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:12.864658 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.864638 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gp5s2"]
Apr 24 21:16:12.864759 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.864737 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:16:12.864929 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.864908 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:16:12.865263 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.865167 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-txjjd\""
Apr 24 21:16:12.866963 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.866942 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:12.869221 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.869134 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qvvhg\""
Apr 24 21:16:12.869221 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.869138 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 21:16:12.869377 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.869255 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 21:16:12.869464 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.869449 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 21:16:12.874298 ip-10-0-134-147 kubenswrapper[2573]:
I0424 21:16:12.874276 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4fr27"]
Apr 24 21:16:12.878508 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:12.878487 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gp5s2"]
Apr 24 21:16:13.001881 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.001846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-ca-trust-extracted\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.002050 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.001896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-installation-pull-secrets\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.002050 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.001925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/022d6343-7dfc-470e-8e3c-3380ea630933-tmp-dir\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.002050 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID:
\"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.002050 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-trusted-ca\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.002222 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022d6343-7dfc-470e-8e3c-3380ea630933-config-volume\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.002222 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:13.002222 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkw66\" (UniqueName: \"kubernetes.io/projected/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-kube-api-access-nkw66\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:13.002361 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-image-registry-private-configuration\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.002361 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.002361 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-certificates\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.002361 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8l8l\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-kube-api-access-m8l8l\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.002361 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7k8x\" (UniqueName: \"kubernetes.io/projected/022d6343-7dfc-470e-8e3c-3380ea630933-kube-api-access-j7k8x\") pod \"dns-default-4fr27\" (UID:
\"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.002572 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.002386 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-bound-sa-token\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.103203 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-ca-trust-extracted\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.103203 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103165 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-installation-pull-secrets\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.103203 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/022d6343-7dfc-470e-8e3c-3380ea630933-tmp-dir\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103240 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName:
\"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-trusted-ca\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022d6343-7dfc-470e-8e3c-3380ea630933-config-volume\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkw66\" (UniqueName: \"kubernetes.io/projected/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-kube-api-access-nkw66\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-image-registry-private-configuration\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.103382 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.103404 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.103437 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.103467 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:13.603449566 +0000 UTC m=+33.599200302 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found
Apr 24 21:16:13.103470 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.103479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed.
No retries permitted until 2026-04-24 21:16:13.603473441 +0000 UTC m=+33.599224177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-certificates\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/022d6343-7dfc-470e-8e3c-3380ea630933-tmp-dir\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8l8l\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-kube-api-access-m8l8l\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-ca-trust-extracted\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7k8x\" (UniqueName: \"kubernetes.io/projected/022d6343-7dfc-470e-8e3c-3380ea630933-kube-api-access-j7k8x\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.103627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-bound-sa-token\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.103640 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:13.103966 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.103690 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:13.603675327 +0000 UTC m=+33.599426071 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found Apr 24 21:16:13.104353 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.104191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/022d6343-7dfc-470e-8e3c-3380ea630933-config-volume\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:16:13.104353 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.104243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-trusted-ca\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.104438 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.104351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-certificates\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.107837 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.107817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-installation-pull-secrets\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.107837 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.107823 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-image-registry-private-configuration\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.112931 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.112855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7k8x\" (UniqueName: \"kubernetes.io/projected/022d6343-7dfc-470e-8e3c-3380ea630933-kube-api-access-j7k8x\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:16:13.113044 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.112962 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkw66\" (UniqueName: \"kubernetes.io/projected/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-kube-api-access-nkw66\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:16:13.113111 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.113091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-bound-sa-token\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.114215 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.114196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8l8l\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-kube-api-access-m8l8l\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " 
pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.567369 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.567334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:13.567524 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.567335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:16:13.567524 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.567335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:13.570149 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.570128 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:16:13.570149 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.570157 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:16:13.570350 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.570209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:16:13.570350 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.570249 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2rwk\"" Apr 24 21:16:13.570350 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.570267 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m7v99\"" Apr 24 21:16:13.570864 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.570847 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:16:13.607453 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.607433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:13.607539 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.607468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:16:13.607539 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:13.607496 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:16:13.607624 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.607557 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:13.607624 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.607571 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found Apr 24 21:16:13.607624 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.607578 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:13.607624 ip-10-0-134-147 kubenswrapper[2573]: E0424 
21:16:13.607597 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:13.607624 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.607616 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:14.607602788 +0000 UTC m=+34.603353524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found Apr 24 21:16:13.607792 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.607629 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:14.607623114 +0000 UTC m=+34.603373850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found Apr 24 21:16:13.607792 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:13.607640 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:14.607634591 +0000 UTC m=+34.603385327 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found Apr 24 21:16:14.314319 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.314290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n" Apr 24 21:16:14.315025 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.314450 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:16:14.315025 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.314524 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:46.314505754 +0000 UTC m=+66.310256491 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : secret "metrics-daemon-secret" not found Apr 24 21:16:14.414653 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.414626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:14.426763 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.426738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7mp6\" (UniqueName: \"kubernetes.io/projected/ff449891-1658-40ff-a0bd-e08978c661e9-kube-api-access-s7mp6\") pod \"network-check-target-wtj7q\" (UID: \"ff449891-1658-40ff-a0bd-e08978c661e9\") " pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:14.486956 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.486930 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:14.616303 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.616275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:14.616405 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.616315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:16:14.616405 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.616345 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:16:14.616483 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.616419 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:16:14.616483 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.616432 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:14.616483 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.616439 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found Apr 24 21:16:14.616483 ip-10-0-134-147 
kubenswrapper[2573]: E0424 21:16:14.616469 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:14.616483 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.616482 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:16.616468932 +0000 UTC m=+36.612219668 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found Apr 24 21:16:14.616628 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.616494 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:16.616488524 +0000 UTC m=+36.612239260 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found Apr 24 21:16:14.616628 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:14.616516 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:16.616499574 +0000 UTC m=+36.612250312 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found Apr 24 21:16:14.649622 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.649590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wtj7q"] Apr 24 21:16:14.661105 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:16:14.661083 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff449891_1658_40ff_a0bd_e08978c661e9.slice/crio-7200009312c4dc9f223fa3216569829ece941dde5077d64c4b4c187b2290a493 WatchSource:0}: Error finding container 7200009312c4dc9f223fa3216569829ece941dde5077d64c4b4c187b2290a493: Status 404 returned error can't find the container with id 7200009312c4dc9f223fa3216569829ece941dde5077d64c4b4c187b2290a493 Apr 24 21:16:14.732602 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.732574 2573 generic.go:358] "Generic (PLEG): container finished" podID="f855c2da-63b3-4393-85d5-d812d3b86100" containerID="922678d5d4dc99c07b65f1afc7307a1dfd88393899944046320d19f1f7a89a33" exitCode=0 Apr 24 21:16:14.732706 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.732633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerDied","Data":"922678d5d4dc99c07b65f1afc7307a1dfd88393899944046320d19f1f7a89a33"} Apr 24 21:16:14.733626 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:14.733602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wtj7q" event={"ID":"ff449891-1658-40ff-a0bd-e08978c661e9","Type":"ContainerStarted","Data":"7200009312c4dc9f223fa3216569829ece941dde5077d64c4b4c187b2290a493"} Apr 24 21:16:15.738618 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:15.738584 2573 generic.go:358] "Generic (PLEG): container finished" podID="f855c2da-63b3-4393-85d5-d812d3b86100" containerID="78ec1c41ec0924395fabcb9fbdeaade78e9635d95b39e43abc0078bae6ce133e" exitCode=0 Apr 24 21:16:15.739114 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:15.738634 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerDied","Data":"78ec1c41ec0924395fabcb9fbdeaade78e9635d95b39e43abc0078bae6ce133e"} Apr 24 21:16:16.632948 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:16.632911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:16.633133 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:16.632968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:16:16.633133 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:16.633015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:16:16.633133 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:16.633096 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 
24 21:16:16.633133 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:16.633118 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found Apr 24 21:16:16.633337 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:16.633138 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:16:16.633337 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:16.633138 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:16:16.633337 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:16.633184 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.633163347 +0000 UTC m=+40.628914087 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found Apr 24 21:16:16.633337 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:16.633206 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.633193405 +0000 UTC m=+40.628944148 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found Apr 24 21:16:16.633337 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:16.633221 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:20.633212281 +0000 UTC m=+40.628963020 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found Apr 24 21:16:16.744331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:16.744286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" event={"ID":"f855c2da-63b3-4393-85d5-d812d3b86100","Type":"ContainerStarted","Data":"ab8b47f877c5e5fe53e5dc5d3e26e1fea322abd2b8f413f5643c53e3141c888d"} Apr 24 21:16:16.768368 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:16.768320 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hz9sp" podStartSLOduration=6.244716388 podStartE2EDuration="36.768303409s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:15:43.230653099 +0000 UTC m=+3.226403848" lastFinishedPulling="2026-04-24 21:16:13.754240122 +0000 UTC m=+33.749990869" observedRunningTime="2026-04-24 21:16:16.767220434 +0000 UTC m=+36.762971205" watchObservedRunningTime="2026-04-24 21:16:16.768303409 +0000 UTC m=+36.764054168" Apr 24 21:16:18.750456 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:18.750270 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wtj7q" event={"ID":"ff449891-1658-40ff-a0bd-e08978c661e9","Type":"ContainerStarted","Data":"d4716c8723eefa67762f785dee42b44887ad4cc8ee7de955db56166ea197f88b"} Apr 24 21:16:18.751015 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:18.750467 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wtj7q" Apr 24 21:16:18.767755 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:18.767714 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wtj7q" podStartSLOduration=34.734168435 podStartE2EDuration="37.767700303s" podCreationTimestamp="2026-04-24 21:15:41 +0000 UTC" firstStartedPulling="2026-04-24 21:16:14.662861061 +0000 UTC m=+34.658611798" lastFinishedPulling="2026-04-24 21:16:17.69639293 +0000 UTC m=+37.692143666" observedRunningTime="2026-04-24 21:16:18.767008042 +0000 UTC m=+38.762758801" watchObservedRunningTime="2026-04-24 21:16:18.767700303 +0000 UTC m=+38.763451127" Apr 24 21:16:19.050792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:19.050716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:19.054694 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:19.054666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d143aeb1-2388-4e2b-94e5-feca18fa8e79-original-pull-secret\") pod \"global-pull-secret-syncer-mt97w\" (UID: \"d143aeb1-2388-4e2b-94e5-feca18fa8e79\") " pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:19.277109 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:16:19.277078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mt97w" Apr 24 21:16:19.386152 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:19.386126 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mt97w"] Apr 24 21:16:19.389223 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:16:19.389193 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd143aeb1_2388_4e2b_94e5_feca18fa8e79.slice/crio-c9d441c66ef4e1b062347211d940df4c44d810aaace2023fa510d76be5379053 WatchSource:0}: Error finding container c9d441c66ef4e1b062347211d940df4c44d810aaace2023fa510d76be5379053: Status 404 returned error can't find the container with id c9d441c66ef4e1b062347211d940df4c44d810aaace2023fa510d76be5379053 Apr 24 21:16:19.753573 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:19.753541 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mt97w" event={"ID":"d143aeb1-2388-4e2b-94e5-feca18fa8e79","Type":"ContainerStarted","Data":"c9d441c66ef4e1b062347211d940df4c44d810aaace2023fa510d76be5379053"} Apr 24 21:16:20.661860 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:20.661828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:16:20.661860 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:20.661867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: 
\"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:20.662120 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:20.661922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:20.662120 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:20.661992 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:20.662120 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:20.662011 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found
Apr 24 21:16:20.662120 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:20.662032 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:20.662120 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:20.662070 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:28.662050272 +0000 UTC m=+48.657801022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found
Apr 24 21:16:20.662120 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:20.662090 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:28.662079744 +0000 UTC m=+48.657830483 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:20.662120 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:20.662096 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:20.662434 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:20.662131 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:28.662116481 +0000 UTC m=+48.657867222 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found
Apr 24 21:16:24.764778 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:24.764742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mt97w" event={"ID":"d143aeb1-2388-4e2b-94e5-feca18fa8e79","Type":"ContainerStarted","Data":"68f695bb1033c7239c8eb69d9e37f51d23353dd36272fcd3375b80f6d8aadbcc"}
Apr 24 21:16:24.779569 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:24.779526 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mt97w" podStartSLOduration=33.224599782 podStartE2EDuration="37.779513432s" podCreationTimestamp="2026-04-24 21:15:47 +0000 UTC" firstStartedPulling="2026-04-24 21:16:19.391000842 +0000 UTC m=+39.386751578" lastFinishedPulling="2026-04-24 21:16:23.945914492 +0000 UTC m=+43.941665228" observedRunningTime="2026-04-24 21:16:24.778825531 +0000 UTC m=+44.774576288" watchObservedRunningTime="2026-04-24 21:16:24.779513432 +0000 UTC m=+44.775264190"
Apr 24 21:16:28.718078 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:28.718043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:28.718087 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:28.718113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:28.718182 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:28.718198 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:28.718204 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:28.718210 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:28.718260 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:44.718244837 +0000 UTC m=+64.713995582 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:28.718273 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:16:44.71826739 +0000 UTC m=+64.714018126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found
Apr 24 21:16:28.718442 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:28.718284 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:44.718279309 +0000 UTC m=+64.714030045 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:38.730259 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:38.730228 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zvr9w"
Apr 24 21:16:44.727097 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:44.727066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:16:44.727097 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:44.727107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:44.727212 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:44.727218 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:44.727258 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:16.727245886 +0000 UTC m=+96.722996621 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:44.727272 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:17:16.727265665 +0000 UTC m=+96.723016402 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:44.727284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:44.727339 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:44.727346 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found
Apr 24 21:16:44.727531 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:44.727371 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:16.727365722 +0000 UTC m=+96.723116458 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found
Apr 24 21:16:46.337391 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:46.337338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:16:46.337796 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:46.337493 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:16:46.337796 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:16:46.337569 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs podName:223043ea-b132-4d5d-9a14-0496d53fdc53 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:50.337551829 +0000 UTC m=+130.333302565 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs") pod "network-metrics-daemon-m6d6n" (UID: "223043ea-b132-4d5d-9a14-0496d53fdc53") : secret "metrics-daemon-secret" not found
Apr 24 21:16:49.756258 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:16:49.756231 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wtj7q"
Apr 24 21:17:16.748197 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:16.748165 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2"
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:16.748203 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27"
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:16.748271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:16.748310 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:16.748355 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:16.748367 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7675dc97f4-6p6nx: secret "image-registry-tls" not found
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:16.748388 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert podName:7c51aa96-bca7-47fa-bba2-badf0e22ee4d nodeName:}" failed. No retries permitted until 2026-04-24 21:18:20.748367664 +0000 UTC m=+160.744118405 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert") pod "ingress-canary-gp5s2" (UID: "7c51aa96-bca7-47fa-bba2-badf0e22ee4d") : secret "canary-serving-cert" not found
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:16.748408 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls podName:f5c73ec4-3e98-4b0a-a1ae-236998f28fb1 nodeName:}" failed. No retries permitted until 2026-04-24 21:18:20.74839752 +0000 UTC m=+160.744148257 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls") pod "image-registry-7675dc97f4-6p6nx" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1") : secret "image-registry-tls" not found
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:16.748421 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:17:16.748706 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:16.748506 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls podName:022d6343-7dfc-470e-8e3c-3380ea630933 nodeName:}" failed. No retries permitted until 2026-04-24 21:18:20.748484984 +0000 UTC m=+160.744235720 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls") pod "dns-default-4fr27" (UID: "022d6343-7dfc-470e-8e3c-3380ea630933") : secret "dns-default-metrics-tls" not found
Apr 24 21:17:30.087743 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.087707 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"]
Apr 24 21:17:30.090455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.090438 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.092957 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.092936 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:17:30.093187 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.093170 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-gdt8q\""
Apr 24 21:17:30.110455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.110433 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:17:30.110562 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.110481 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 21:17:30.110562 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.110489 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 21:17:30.111314 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.111297 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"]
Apr 24 21:17:30.142021 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.141994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/772a4257-de5c-42f7-8b8c-0ee2404f99a6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.142111 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.142023 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77dhk\" (UniqueName: \"kubernetes.io/projected/772a4257-de5c-42f7-8b8c-0ee2404f99a6-kube-api-access-77dhk\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.142111 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.142051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.200250 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.200226 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"]
Apr 24 21:17:30.202993 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.202979 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:30.206324 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.206301 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xsrdx"]
Apr 24 21:17:30.208148 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.208129 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 21:17:30.208230 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.208127 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 21:17:30.208584 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.208565 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:17:30.209012 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.208997 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s"]
Apr 24 21:17:30.209161 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.209145 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.210633 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.210616 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-v6jxq\""
Apr 24 21:17:30.211641 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.211627 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dln5m"]
Apr 24 21:17:30.211738 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.211724 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s"
Apr 24 21:17:30.212619 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.212600 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:17:30.212706 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.212647 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 21:17:30.212822 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.212811 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 21:17:30.212856 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.212821 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-grwpd\""
Apr 24 21:17:30.212926 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.212821 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:17:30.213795 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.213779 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 21:17:30.214004 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.213988 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 21:17:30.214004 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.214000 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:17:30.214248 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.214236 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 21:17:30.214408 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.214392 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:17:30.215426 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.215410 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-h5fqh\""
Apr 24 21:17:30.216585 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.216560 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:17:30.216836 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.216724 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:17:30.216836 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.216834 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:17:30.217003 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.216826 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:17:30.217093 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.217039 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-zlqqk\""
Apr 24 21:17:30.226088 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.226069 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"]
Apr 24 21:17:30.227689 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.227661 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s"]
Apr 24 21:17:30.228020 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.228002 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 21:17:30.228151 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.228133 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 21:17:30.229461 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.229443 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xsrdx"]
Apr 24 21:17:30.230467 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.230450 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dln5m"]
Apr 24 21:17:30.242668 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.242647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.242755 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.242713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/772a4257-de5c-42f7-8b8c-0ee2404f99a6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.242755 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.242730 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:30.242835 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.242779 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls podName:772a4257-de5c-42f7-8b8c-0ee2404f99a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:30.742765536 +0000 UTC m=+110.738516277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nl9ms" (UID: "772a4257-de5c-42f7-8b8c-0ee2404f99a6") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:30.242835 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.242732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77dhk\" (UniqueName: \"kubernetes.io/projected/772a4257-de5c-42f7-8b8c-0ee2404f99a6-kube-api-access-77dhk\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.243636 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.243613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/772a4257-de5c-42f7-8b8c-0ee2404f99a6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.262958 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.262931 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77dhk\" (UniqueName: \"kubernetes.io/projected/772a4257-de5c-42f7-8b8c-0ee2404f99a6-kube-api-access-77dhk\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:30.343921 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.343824 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-service-ca-bundle\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.343921 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.343865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/621c1634-3d26-4e3e-8a2e-f735fa5423f9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s"
Apr 24 21:17:30.344123 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.343925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-snapshots\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.344123 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.343954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72ttm\" (UniqueName: \"kubernetes.io/projected/c1a58610-8ce3-4b65-8ceb-500127ff5a26-kube-api-access-72ttm\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:17:30.344123 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1a58610-8ce3-4b65-8ceb-500127ff5a26-serving-cert\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:17:30.344123 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rntzv\" (UniqueName: \"kubernetes.io/projected/1706f63c-2ef2-44a4-9a58-455c69e1901d-kube-api-access-rntzv\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:30.344123 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621c1634-3d26-4e3e-8a2e-f735fa5423f9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s"
Apr 24 21:17:30.344123 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5rkm\" (UniqueName: \"kubernetes.io/projected/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-kube-api-access-f5rkm\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.344410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-tmp\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.344410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344159 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.344410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1a58610-8ce3-4b65-8ceb-500127ff5a26-trusted-ca\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:17:30.344410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a58610-8ce3-4b65-8ceb-500127ff5a26-config\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:17:30.344410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344315 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:30.344410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-serving-cert\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.344606 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.344410 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qld5w\" (UniqueName: \"kubernetes.io/projected/621c1634-3d26-4e3e-8a2e-f735fa5423f9-kube-api-access-qld5w\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s"
Apr 24 21:17:30.445393 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:30.445393 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-serving-cert\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx"
Apr 24 21:17:30.445612 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qld5w\" (UniqueName: \"kubernetes.io/projected/621c1634-3d26-4e3e-8a2e-f735fa5423f9-kube-api-access-qld5w\") pod
\"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" Apr 24 21:17:30.445612 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-service-ca-bundle\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.445612 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/621c1634-3d26-4e3e-8a2e-f735fa5423f9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" Apr 24 21:17:30.445612 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.445491 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:17:30.445612 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.445558 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls podName:1706f63c-2ef2-44a4-9a58-455c69e1901d nodeName:}" failed. No retries permitted until 2026-04-24 21:17:30.945537971 +0000 UTC m=+110.941288728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sqcwt" (UID: "1706f63c-2ef2-44a4-9a58-455c69e1901d") : secret "samples-operator-tls" not found Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-snapshots\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72ttm\" (UniqueName: \"kubernetes.io/projected/c1a58610-8ce3-4b65-8ceb-500127ff5a26-kube-api-access-72ttm\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1a58610-8ce3-4b65-8ceb-500127ff5a26-serving-cert\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rntzv\" (UniqueName: \"kubernetes.io/projected/1706f63c-2ef2-44a4-9a58-455c69e1901d-kube-api-access-rntzv\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621c1634-3d26-4e3e-8a2e-f735fa5423f9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5rkm\" (UniqueName: \"kubernetes.io/projected/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-kube-api-access-f5rkm\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-tmp\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445903 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1a58610-8ce3-4b65-8ceb-500127ff5a26-trusted-ca\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.445963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a58610-8ce3-4b65-8ceb-500127ff5a26-config\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.446242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-service-ca-bundle\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446346 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.446343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-snapshots\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446985 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.446576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621c1634-3d26-4e3e-8a2e-f735fa5423f9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" Apr 24 21:17:30.446985 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.446597 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-tmp\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446985 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.446670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.446985 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.446723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a58610-8ce3-4b65-8ceb-500127ff5a26-config\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.447407 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.447386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1a58610-8ce3-4b65-8ceb-500127ff5a26-trusted-ca\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.448004 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.447988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/621c1634-3d26-4e3e-8a2e-f735fa5423f9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" Apr 24 21:17:30.448070 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.448044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-serving-cert\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.448480 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.448459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1a58610-8ce3-4b65-8ceb-500127ff5a26-serving-cert\") pod \"console-operator-9d4b6777b-dln5m\" (UID: \"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.455508 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.455488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qld5w\" (UniqueName: \"kubernetes.io/projected/621c1634-3d26-4e3e-8a2e-f735fa5423f9-kube-api-access-qld5w\") pod \"kube-storage-version-migrator-operator-6769c5d45-gmz9s\" (UID: \"621c1634-3d26-4e3e-8a2e-f735fa5423f9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" Apr 24 21:17:30.455707 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.455679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72ttm\" (UniqueName: \"kubernetes.io/projected/c1a58610-8ce3-4b65-8ceb-500127ff5a26-kube-api-access-72ttm\") pod \"console-operator-9d4b6777b-dln5m\" (UID: 
\"c1a58610-8ce3-4b65-8ceb-500127ff5a26\") " pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.455912 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.455895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rntzv\" (UniqueName: \"kubernetes.io/projected/1706f63c-2ef2-44a4-9a58-455c69e1901d-kube-api-access-rntzv\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" Apr 24 21:17:30.456000 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.455986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5rkm\" (UniqueName: \"kubernetes.io/projected/fb24c7be-a2bf-47fd-a8da-4bcaf272012a-kube-api-access-f5rkm\") pod \"insights-operator-585dfdc468-xsrdx\" (UID: \"fb24c7be-a2bf-47fd-a8da-4bcaf272012a\") " pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.519963 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.519943 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xsrdx" Apr 24 21:17:30.530477 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.530456 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" Apr 24 21:17:30.536627 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.536521 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:17:30.649508 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.649483 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xsrdx"] Apr 24 21:17:30.652611 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:17:30.652585 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb24c7be_a2bf_47fd_a8da_4bcaf272012a.slice/crio-25771404fe77b1b02f34942cddae0ea51c7a66776060f633ef218c75b1c99341 WatchSource:0}: Error finding container 25771404fe77b1b02f34942cddae0ea51c7a66776060f633ef218c75b1c99341: Status 404 returned error can't find the container with id 25771404fe77b1b02f34942cddae0ea51c7a66776060f633ef218c75b1c99341 Apr 24 21:17:30.748908 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.748863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" Apr 24 21:17:30.749059 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.748997 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:30.749059 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.749052 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls podName:772a4257-de5c-42f7-8b8c-0ee2404f99a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:31.749037185 +0000 UTC m=+111.744787920 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nl9ms" (UID: "772a4257-de5c-42f7-8b8c-0ee2404f99a6") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:30.871390 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.871330 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s"] Apr 24 21:17:30.874098 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:17:30.874071 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod621c1634_3d26_4e3e_8a2e_f735fa5423f9.slice/crio-26bf2a57256345f92d686fde9156c4cfa24f069bd64cb749f7c3cc0a4818402b WatchSource:0}: Error finding container 26bf2a57256345f92d686fde9156c4cfa24f069bd64cb749f7c3cc0a4818402b: Status 404 returned error can't find the container with id 26bf2a57256345f92d686fde9156c4cfa24f069bd64cb749f7c3cc0a4818402b Apr 24 21:17:30.874495 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.874479 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dln5m"] Apr 24 21:17:30.877306 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:17:30.877285 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a58610_8ce3_4b65_8ceb_500127ff5a26.slice/crio-b085ddf693cedde48ef3e9940de153c716f51e6b809a09fd13b5d35ac1e0f27c WatchSource:0}: Error finding container b085ddf693cedde48ef3e9940de153c716f51e6b809a09fd13b5d35ac1e0f27c: Status 404 returned error can't find the container with id b085ddf693cedde48ef3e9940de153c716f51e6b809a09fd13b5d35ac1e0f27c Apr 24 21:17:30.882102 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.882079 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" event={"ID":"c1a58610-8ce3-4b65-8ceb-500127ff5a26","Type":"ContainerStarted","Data":"b085ddf693cedde48ef3e9940de153c716f51e6b809a09fd13b5d35ac1e0f27c"} Apr 24 21:17:30.882992 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.882962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" event={"ID":"621c1634-3d26-4e3e-8a2e-f735fa5423f9","Type":"ContainerStarted","Data":"26bf2a57256345f92d686fde9156c4cfa24f069bd64cb749f7c3cc0a4818402b"} Apr 24 21:17:30.883805 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.883781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xsrdx" event={"ID":"fb24c7be-a2bf-47fd-a8da-4bcaf272012a","Type":"ContainerStarted","Data":"25771404fe77b1b02f34942cddae0ea51c7a66776060f633ef218c75b1c99341"} Apr 24 21:17:30.950502 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:30.950475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" Apr 24 21:17:30.950634 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.950616 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:17:30.950686 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:30.950676 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls podName:1706f63c-2ef2-44a4-9a58-455c69e1901d nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:31.950658749 +0000 UTC m=+111.946409488 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sqcwt" (UID: "1706f63c-2ef2-44a4-9a58-455c69e1901d") : secret "samples-operator-tls" not found Apr 24 21:17:31.756908 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:31.756524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" Apr 24 21:17:31.756908 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:31.756770 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:31.756908 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:31.756853 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls podName:772a4257-de5c-42f7-8b8c-0ee2404f99a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:33.756832888 +0000 UTC m=+113.752583626 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nl9ms" (UID: "772a4257-de5c-42f7-8b8c-0ee2404f99a6") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:17:31.879610 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:31.879558 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl"] Apr 24 21:17:31.882855 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:31.882834 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl" Apr 24 21:17:31.885888 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:31.885653 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4jxgj\"" Apr 24 21:17:31.894049 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:31.892995 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl"] Apr 24 21:17:31.960059 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:31.959368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xljfw\" (UniqueName: \"kubernetes.io/projected/593cf1e8-79f1-4dd1-a9db-ebd333078407-kube-api-access-xljfw\") pod \"network-check-source-8894fc9bd-bg2dl\" (UID: \"593cf1e8-79f1-4dd1-a9db-ebd333078407\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl" Apr 24 21:17:31.960059 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:31.959441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" Apr 24 21:17:31.960059 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:31.959539 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:17:31.960059 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:31.959614 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls podName:1706f63c-2ef2-44a4-9a58-455c69e1901d nodeName:}" failed. No retries permitted until 2026-04-24 21:17:33.959587284 +0000 UTC m=+113.955338035 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sqcwt" (UID: "1706f63c-2ef2-44a4-9a58-455c69e1901d") : secret "samples-operator-tls" not found Apr 24 21:17:32.061153 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:32.061006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xljfw\" (UniqueName: \"kubernetes.io/projected/593cf1e8-79f1-4dd1-a9db-ebd333078407-kube-api-access-xljfw\") pod \"network-check-source-8894fc9bd-bg2dl\" (UID: \"593cf1e8-79f1-4dd1-a9db-ebd333078407\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl" Apr 24 21:17:32.074750 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:32.074718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xljfw\" (UniqueName: \"kubernetes.io/projected/593cf1e8-79f1-4dd1-a9db-ebd333078407-kube-api-access-xljfw\") pod \"network-check-source-8894fc9bd-bg2dl\" (UID: \"593cf1e8-79f1-4dd1-a9db-ebd333078407\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl"
Apr 24 21:17:32.202640 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:32.202606 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl"
Apr 24 21:17:33.536207 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.536162 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl"]
Apr 24 21:17:33.541143 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:17:33.541112 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod593cf1e8_79f1_4dd1_a9db_ebd333078407.slice/crio-d73bc4e7dc430ab167c061fe85ecb6f2c29e2186012615c1efbe1cf68c6bfdaf WatchSource:0}: Error finding container d73bc4e7dc430ab167c061fe85ecb6f2c29e2186012615c1efbe1cf68c6bfdaf: Status 404 returned error can't find the container with id d73bc4e7dc430ab167c061fe85ecb6f2c29e2186012615c1efbe1cf68c6bfdaf
Apr 24 21:17:33.774382 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.774350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:33.774557 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:33.774524 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:33.774615 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:33.774603 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls podName:772a4257-de5c-42f7-8b8c-0ee2404f99a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:37.774583342 +0000 UTC m=+117.770334084 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nl9ms" (UID: "772a4257-de5c-42f7-8b8c-0ee2404f99a6") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:33.894202 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.894174 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/0.log"
Apr 24 21:17:33.894347 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.894216 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1a58610-8ce3-4b65-8ceb-500127ff5a26" containerID="2dfeed97410927ec819ab97242214285a3b39017557c43014b561e6923c1884f" exitCode=255
Apr 24 21:17:33.894347 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.894249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" event={"ID":"c1a58610-8ce3-4b65-8ceb-500127ff5a26","Type":"ContainerDied","Data":"2dfeed97410927ec819ab97242214285a3b39017557c43014b561e6923c1884f"}
Apr 24 21:17:33.894531 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.894515 2573 scope.go:117] "RemoveContainer" containerID="2dfeed97410927ec819ab97242214285a3b39017557c43014b561e6923c1884f"
Apr 24 21:17:33.895930 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.895903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl" event={"ID":"593cf1e8-79f1-4dd1-a9db-ebd333078407","Type":"ContainerStarted","Data":"186148262f5646aaa8c22baaf52549490f222ad4414dd4305d5f4ee84b245a51"}
Apr 24 21:17:33.896029 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.895936 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl" event={"ID":"593cf1e8-79f1-4dd1-a9db-ebd333078407","Type":"ContainerStarted","Data":"d73bc4e7dc430ab167c061fe85ecb6f2c29e2186012615c1efbe1cf68c6bfdaf"}
Apr 24 21:17:33.897174 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.897151 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" event={"ID":"621c1634-3d26-4e3e-8a2e-f735fa5423f9","Type":"ContainerStarted","Data":"990d51c271f077ba75c202ef6760de4d5fb3e586a9a11257a796a550c5110ab4"}
Apr 24 21:17:33.898687 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.898666 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xsrdx" event={"ID":"fb24c7be-a2bf-47fd-a8da-4bcaf272012a","Type":"ContainerStarted","Data":"feed3bc54a3ffa1c1f70ff627276e496fcedceaa233158ee727b3a1b055cb691"}
Apr 24 21:17:33.930847 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.930805 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bg2dl" podStartSLOduration=2.930790847 podStartE2EDuration="2.930790847s" podCreationTimestamp="2026-04-24 21:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:33.930279847 +0000 UTC m=+113.926030606" watchObservedRunningTime="2026-04-24 21:17:33.930790847 +0000 UTC m=+113.926541607"
Apr 24 21:17:33.950509 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.950469 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" podStartSLOduration=1.407236507 podStartE2EDuration="3.95045527s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="2026-04-24 21:17:30.875758659 +0000 UTC m=+110.871509395" lastFinishedPulling="2026-04-24 21:17:33.418977422 +0000 UTC m=+113.414728158" observedRunningTime="2026-04-24 21:17:33.949214057 +0000 UTC m=+113.944964816" watchObservedRunningTime="2026-04-24 21:17:33.95045527 +0000 UTC m=+113.946206029"
Apr 24 21:17:33.975424 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.975380 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-xsrdx" podStartSLOduration=1.21805616 podStartE2EDuration="3.975365676s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="2026-04-24 21:17:30.655216275 +0000 UTC m=+110.650967014" lastFinishedPulling="2026-04-24 21:17:33.412525786 +0000 UTC m=+113.408276530" observedRunningTime="2026-04-24 21:17:33.975218025 +0000 UTC m=+113.970968788" watchObservedRunningTime="2026-04-24 21:17:33.975365676 +0000 UTC m=+113.971116435"
Apr 24 21:17:33.976942 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:33.976555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:33.980418 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:33.980391 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:17:33.980515 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:33.980474 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls podName:1706f63c-2ef2-44a4-9a58-455c69e1901d nodeName:}" failed. No retries permitted until 2026-04-24 21:17:37.980454937 +0000 UTC m=+117.976205676 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sqcwt" (UID: "1706f63c-2ef2-44a4-9a58-455c69e1901d") : secret "samples-operator-tls" not found
Apr 24 21:17:34.902238 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:34.902206 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/1.log"
Apr 24 21:17:34.902670 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:34.902573 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/0.log"
Apr 24 21:17:34.902670 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:34.902609 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1a58610-8ce3-4b65-8ceb-500127ff5a26" containerID="0703844da8487429d581535178c2146fe72a2691d700a59d71afa177ab814ca9" exitCode=255
Apr 24 21:17:34.902782 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:34.902724 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" event={"ID":"c1a58610-8ce3-4b65-8ceb-500127ff5a26","Type":"ContainerDied","Data":"0703844da8487429d581535178c2146fe72a2691d700a59d71afa177ab814ca9"}
Apr 24 21:17:34.902782 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:34.902775 2573 scope.go:117] "RemoveContainer" containerID="2dfeed97410927ec819ab97242214285a3b39017557c43014b561e6923c1884f"
Apr 24 21:17:34.903146 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:34.903127 2573 scope.go:117] "RemoveContainer" containerID="0703844da8487429d581535178c2146fe72a2691d700a59d71afa177ab814ca9"
Apr 24 21:17:34.903338 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:34.903315 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dln5m_openshift-console-operator(c1a58610-8ce3-4b65-8ceb-500127ff5a26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" podUID="c1a58610-8ce3-4b65-8ceb-500127ff5a26"
Apr 24 21:17:35.906126 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:35.906100 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/1.log"
Apr 24 21:17:35.906664 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:35.906500 2573 scope.go:117] "RemoveContainer" containerID="0703844da8487429d581535178c2146fe72a2691d700a59d71afa177ab814ca9"
Apr 24 21:17:35.906780 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:35.906761 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dln5m_openshift-console-operator(c1a58610-8ce3-4b65-8ceb-500127ff5a26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" podUID="c1a58610-8ce3-4b65-8ceb-500127ff5a26"
Apr 24 21:17:36.300264 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.300227 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"]
Apr 24 21:17:36.304400 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.304377 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"
Apr 24 21:17:36.307193 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.307175 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 21:17:36.307301 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.307242 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tzktt\""
Apr 24 21:17:36.310357 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.310340 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 21:17:36.316101 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.316081 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"]
Apr 24 21:17:36.395761 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.395735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvhs\" (UniqueName: \"kubernetes.io/projected/4dae161f-cd3c-4605-b787-8a713855dd4c-kube-api-access-lgvhs\") pod \"migrator-74bb7799d9-mrzsk\" (UID: \"4dae161f-cd3c-4605-b787-8a713855dd4c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"
Apr 24 21:17:36.496282 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.496256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvhs\" (UniqueName: \"kubernetes.io/projected/4dae161f-cd3c-4605-b787-8a713855dd4c-kube-api-access-lgvhs\") pod \"migrator-74bb7799d9-mrzsk\" (UID: \"4dae161f-cd3c-4605-b787-8a713855dd4c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"
Apr 24 21:17:36.506376 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.506355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvhs\" (UniqueName: \"kubernetes.io/projected/4dae161f-cd3c-4605-b787-8a713855dd4c-kube-api-access-lgvhs\") pod \"migrator-74bb7799d9-mrzsk\" (UID: \"4dae161f-cd3c-4605-b787-8a713855dd4c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"
Apr 24 21:17:36.613020 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.612935 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"
Apr 24 21:17:36.728123 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.728092 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk"]
Apr 24 21:17:36.730609 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:17:36.730577 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dae161f_cd3c_4605_b787_8a713855dd4c.slice/crio-ec51411a02084ab101180daf443c5c2798cb0abe89c4b01aa72547c3690ff6f0 WatchSource:0}: Error finding container ec51411a02084ab101180daf443c5c2798cb0abe89c4b01aa72547c3690ff6f0: Status 404 returned error can't find the container with id ec51411a02084ab101180daf443c5c2798cb0abe89c4b01aa72547c3690ff6f0
Apr 24 21:17:36.909603 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:36.909533 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk" event={"ID":"4dae161f-cd3c-4605-b787-8a713855dd4c","Type":"ContainerStarted","Data":"ec51411a02084ab101180daf443c5c2798cb0abe89c4b01aa72547c3690ff6f0"}
Apr 24 21:17:37.271997 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.271967 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-tg7vx"]
Apr 24 21:17:37.274978 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.274958 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.282631 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.282540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 21:17:37.282735 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.282644 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 21:17:37.282801 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.282540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 21:17:37.282927 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.282908 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 21:17:37.289462 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.289447 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-8gxw7\""
Apr 24 21:17:37.303407 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.303385 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-tg7vx"]
Apr 24 21:17:37.404145 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.404109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1075d842-efff-4daf-bcac-f98aeb77665d-signing-key\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.404342 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.404185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr5vz\" (UniqueName: \"kubernetes.io/projected/1075d842-efff-4daf-bcac-f98aeb77665d-kube-api-access-pr5vz\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.404342 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.404255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1075d842-efff-4daf-bcac-f98aeb77665d-signing-cabundle\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.504942 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.504907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1075d842-efff-4daf-bcac-f98aeb77665d-signing-key\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.505091 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.504965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr5vz\" (UniqueName: \"kubernetes.io/projected/1075d842-efff-4daf-bcac-f98aeb77665d-kube-api-access-pr5vz\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.505091 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.504994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1075d842-efff-4daf-bcac-f98aeb77665d-signing-cabundle\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.505621 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.505597 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1075d842-efff-4daf-bcac-f98aeb77665d-signing-cabundle\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.507208 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.507188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1075d842-efff-4daf-bcac-f98aeb77665d-signing-key\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.516472 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.516450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr5vz\" (UniqueName: \"kubernetes.io/projected/1075d842-efff-4daf-bcac-f98aeb77665d-kube-api-access-pr5vz\") pod \"service-ca-865cb79987-tg7vx\" (UID: \"1075d842-efff-4daf-bcac-f98aeb77665d\") " pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.585851 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.585785 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-tg7vx"
Apr 24 21:17:37.808056 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.807848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:37.808056 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:37.808044 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:37.808246 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:37.808112 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls podName:772a4257-de5c-42f7-8b8c-0ee2404f99a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:45.808090165 +0000 UTC m=+125.803840915 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nl9ms" (UID: "772a4257-de5c-42f7-8b8c-0ee2404f99a6") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:37.859010 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.858818 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-tg7vx"]
Apr 24 21:17:37.862539 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:17:37.862515 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1075d842_efff_4daf_bcac_f98aeb77665d.slice/crio-399a1a145fa8913cd4e14bdc5e92bcbb6a9d669bfd27e5fed1044750dc139ed0 WatchSource:0}: Error finding container 399a1a145fa8913cd4e14bdc5e92bcbb6a9d669bfd27e5fed1044750dc139ed0: Status 404 returned error can't find the container with id 399a1a145fa8913cd4e14bdc5e92bcbb6a9d669bfd27e5fed1044750dc139ed0
Apr 24 21:17:37.913578 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.913552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk" event={"ID":"4dae161f-cd3c-4605-b787-8a713855dd4c","Type":"ContainerStarted","Data":"7af6f0462d62e57868cf6f54a2d4e06b90add68f81e74ee640e01de94b8daf76"}
Apr 24 21:17:37.913902 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.913586 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk" event={"ID":"4dae161f-cd3c-4605-b787-8a713855dd4c","Type":"ContainerStarted","Data":"e51557a931de7a51ef73542d21901034a2cdeba30f971e1e44e0a64bd1a1bd8d"}
Apr 24 21:17:37.914516 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.914494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-tg7vx" event={"ID":"1075d842-efff-4daf-bcac-f98aeb77665d","Type":"ContainerStarted","Data":"399a1a145fa8913cd4e14bdc5e92bcbb6a9d669bfd27e5fed1044750dc139ed0"}
Apr 24 21:17:37.955579 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:37.955534 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mrzsk" podStartSLOduration=0.900420196 podStartE2EDuration="1.955521668s" podCreationTimestamp="2026-04-24 21:17:36 +0000 UTC" firstStartedPulling="2026-04-24 21:17:36.732508894 +0000 UTC m=+116.728259630" lastFinishedPulling="2026-04-24 21:17:37.787610345 +0000 UTC m=+117.783361102" observedRunningTime="2026-04-24 21:17:37.954083929 +0000 UTC m=+117.949834686" watchObservedRunningTime="2026-04-24 21:17:37.955521668 +0000 UTC m=+117.951272492"
Apr 24 21:17:38.009944 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:38.009918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:38.010067 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:38.010051 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:17:38.010130 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:38.010115 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls podName:1706f63c-2ef2-44a4-9a58-455c69e1901d nodeName:}" failed. No retries permitted until 2026-04-24 21:17:46.01009588 +0000 UTC m=+126.005846619 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sqcwt" (UID: "1706f63c-2ef2-44a4-9a58-455c69e1901d") : secret "samples-operator-tls" not found
Apr 24 21:17:38.089910 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:38.089888 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zscc7_a1ae49ae-a1e3-464e-a9db-3d0bad2349ab/dns-node-resolver/0.log"
Apr 24 21:17:38.878099 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:38.878065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wv544_f71747fd-1913-4d70-b833-4f352b05ba15/node-ca/0.log"
Apr 24 21:17:39.921301 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:39.921229 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-tg7vx" event={"ID":"1075d842-efff-4daf-bcac-f98aeb77665d","Type":"ContainerStarted","Data":"191d3ee4a3fd6d7ee94dbcf3d61a64aa610e5040fd0d210650d9f0ede55dfac1"}
Apr 24 21:17:39.970188 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:39.970147 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-tg7vx" podStartSLOduration=1.245706822 podStartE2EDuration="2.970134752s" podCreationTimestamp="2026-04-24 21:17:37 +0000 UTC" firstStartedPulling="2026-04-24 21:17:37.864676182 +0000 UTC m=+117.860426918" lastFinishedPulling="2026-04-24 21:17:39.589104108 +0000 UTC m=+119.584854848" observedRunningTime="2026-04-24 21:17:39.969863874 +0000 UTC m=+119.965614632" watchObservedRunningTime="2026-04-24 21:17:39.970134752 +0000 UTC m=+119.965885552"
Apr 24 21:17:40.537449 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:40.537417 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:17:40.537630 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:40.537464 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:17:40.537938 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:40.537925 2573 scope.go:117] "RemoveContainer" containerID="0703844da8487429d581535178c2146fe72a2691d700a59d71afa177ab814ca9"
Apr 24 21:17:40.538145 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:40.538125 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dln5m_openshift-console-operator(c1a58610-8ce3-4b65-8ceb-500127ff5a26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" podUID="c1a58610-8ce3-4b65-8ceb-500127ff5a26"
Apr 24 21:17:45.876093 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:45.876054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"
Apr 24 21:17:45.876457 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:45.876201 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:45.876457 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:45.876273 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls podName:772a4257-de5c-42f7-8b8c-0ee2404f99a6 nodeName:}" failed. No retries permitted until 2026-04-24 21:18:01.876256967 +0000 UTC m=+141.872007702 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-nl9ms" (UID: "772a4257-de5c-42f7-8b8c-0ee2404f99a6") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:17:46.077129 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:46.077093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:46.080062 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:46.080040 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1706f63c-2ef2-44a4-9a58-455c69e1901d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sqcwt\" (UID: \"1706f63c-2ef2-44a4-9a58-455c69e1901d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:46.119244 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:46.119214 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-v6jxq\""
Apr 24 21:17:46.123301 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:46.123285 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"
Apr 24 21:17:46.265816 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:46.265780 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt"]
Apr 24 21:17:46.939734 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:46.939699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" event={"ID":"1706f63c-2ef2-44a4-9a58-455c69e1901d","Type":"ContainerStarted","Data":"575256a1a483898c649616f13a350c7b1030150400efc8963e1a00effa3f6ace"}
Apr 24 21:17:48.945544 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:48.945513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" event={"ID":"1706f63c-2ef2-44a4-9a58-455c69e1901d","Type":"ContainerStarted","Data":"1457913f70244955c62b0b5ddf58fdc5108748ed1e0f06f3c7b534bb0b474c3a"}
Apr 24 21:17:48.945544 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:48.945549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" event={"ID":"1706f63c-2ef2-44a4-9a58-455c69e1901d","Type":"ContainerStarted","Data":"05a7ebf0944de02fe6d3703c441e91802c49937a9e18a95e254bb2945d39a9c4"}
Apr 24 21:17:48.963432 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:48.963390 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sqcwt" podStartSLOduration=17.315225955 podStartE2EDuration="18.963378275s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="2026-04-24 21:17:46.308988064 +0000 UTC m=+126.304738800" lastFinishedPulling="2026-04-24 21:17:47.95714038 +0000 UTC m=+127.952891120" observedRunningTime="2026-04-24 21:17:48.962234493 +0000 UTC m=+128.957985251" watchObservedRunningTime="2026-04-24 21:17:48.963378275 +0000 UTC m=+128.959129033"
Apr 24 21:17:50.410989 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:50.410954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:17:50.413158 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:50.413138 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/223043ea-b132-4d5d-9a14-0496d53fdc53-metrics-certs\") pod \"network-metrics-daemon-m6d6n\" (UID: \"223043ea-b132-4d5d-9a14-0496d53fdc53\") " pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:17:50.485534 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:50.485504 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2rwk\""
Apr 24 21:17:50.493562 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:50.493546 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6d6n"
Apr 24 21:17:50.610548 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:50.610501 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m6d6n"]
Apr 24 21:17:50.612906 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:17:50.612866 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223043ea_b132_4d5d_9a14_0496d53fdc53.slice/crio-3839ae89c52de6181591b24c4a2c074a46856e6ef757f96aa00178b73de62010 WatchSource:0}: Error finding container 3839ae89c52de6181591b24c4a2c074a46856e6ef757f96aa00178b73de62010: Status 404 returned error can't find the container with id 3839ae89c52de6181591b24c4a2c074a46856e6ef757f96aa00178b73de62010
Apr 24 21:17:50.951652 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:50.951603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m6d6n" event={"ID":"223043ea-b132-4d5d-9a14-0496d53fdc53","Type":"ContainerStarted","Data":"3839ae89c52de6181591b24c4a2c074a46856e6ef757f96aa00178b73de62010"}
Apr 24 21:17:51.955305 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:51.955274 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m6d6n" event={"ID":"223043ea-b132-4d5d-9a14-0496d53fdc53","Type":"ContainerStarted","Data":"65cb7e02b87cbee1c10dc25320c518a4d75b5b86b69d011086260032fc170b4f"}
Apr 24 21:17:51.955663 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:51.955312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m6d6n" event={"ID":"223043ea-b132-4d5d-9a14-0496d53fdc53","Type":"ContainerStarted","Data":"621915af46c4ffcc8e291dc472a3c0fdf65e9660b37cab8a9fcf130ca28160bd"}
Apr 24 21:17:51.975305 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:51.975259 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-m6d6n" podStartSLOduration=131.091204234 podStartE2EDuration="2m11.975246966s" podCreationTimestamp="2026-04-24 21:15:40 +0000 UTC" firstStartedPulling="2026-04-24 21:17:50.614703373 +0000 UTC m=+130.610454109" lastFinishedPulling="2026-04-24 21:17:51.498746105 +0000 UTC m=+131.494496841" observedRunningTime="2026-04-24 21:17:51.974337509 +0000 UTC m=+131.970088268" watchObservedRunningTime="2026-04-24 21:17:51.975246966 +0000 UTC m=+131.970997751"
Apr 24 21:17:54.567617 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:54.567590 2573 scope.go:117] "RemoveContainer" containerID="0703844da8487429d581535178c2146fe72a2691d700a59d71afa177ab814ca9"
Apr 24 21:17:54.965122 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:54.965097 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log"
Apr 24 21:17:54.965457 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:54.965442 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/1.log"
Apr 24 21:17:54.965504 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:54.965473 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1a58610-8ce3-4b65-8ceb-500127ff5a26" containerID="4d0cf134c927b0e695121051929c9e682bbd6f03bed1624497881742ee187a74" exitCode=255
Apr 24 21:17:54.965573 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:54.965551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" event={"ID":"c1a58610-8ce3-4b65-8ceb-500127ff5a26","Type":"ContainerDied","Data":"4d0cf134c927b0e695121051929c9e682bbd6f03bed1624497881742ee187a74"}
Apr 24 21:17:54.965608 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:54.965597 2573 scope.go:117] "RemoveContainer" containerID="0703844da8487429d581535178c2146fe72a2691d700a59d71afa177ab814ca9"
Apr 24 21:17:54.965974 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:54.965956 2573 scope.go:117] "RemoveContainer" containerID="4d0cf134c927b0e695121051929c9e682bbd6f03bed1624497881742ee187a74"
Apr 24 21:17:54.966160 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:17:54.966143 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dln5m_openshift-console-operator(c1a58610-8ce3-4b65-8ceb-500127ff5a26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" podUID="c1a58610-8ce3-4b65-8ceb-500127ff5a26"
Apr 24 21:17:55.969116 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:17:55.969090 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log"
Apr 24 21:18:00.537618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:00.537592 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:18:00.537618 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:00.537622 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m"
Apr 24 21:18:00.538050 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:00.537976 2573 scope.go:117] "RemoveContainer" containerID="4d0cf134c927b0e695121051929c9e682bbd6f03bed1624497881742ee187a74"
Apr 24 21:18:00.538166 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:00.538148 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator
pod=console-operator-9d4b6777b-dln5m_openshift-console-operator(c1a58610-8ce3-4b65-8ceb-500127ff5a26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" podUID="c1a58610-8ce3-4b65-8ceb-500127ff5a26" Apr 24 21:18:01.893446 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:01.893396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" Apr 24 21:18:01.895775 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:01.895743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/772a4257-de5c-42f7-8b8c-0ee2404f99a6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-nl9ms\" (UID: \"772a4257-de5c-42f7-8b8c-0ee2404f99a6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" Apr 24 21:18:01.901077 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:01.901056 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-gdt8q\"" Apr 24 21:18:01.909070 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:01.909052 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" Apr 24 21:18:02.024910 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:02.024861 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms"] Apr 24 21:18:02.027534 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:02.027506 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772a4257_de5c_42f7_8b8c_0ee2404f99a6.slice/crio-a9a6c7e5dd0ff8e8b23570e5bda5a98ad20ee6300f2697c24dbd2996f83d7b8d WatchSource:0}: Error finding container a9a6c7e5dd0ff8e8b23570e5bda5a98ad20ee6300f2697c24dbd2996f83d7b8d: Status 404 returned error can't find the container with id a9a6c7e5dd0ff8e8b23570e5bda5a98ad20ee6300f2697c24dbd2996f83d7b8d Apr 24 21:18:02.989240 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:02.989206 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" event={"ID":"772a4257-de5c-42f7-8b8c-0ee2404f99a6","Type":"ContainerStarted","Data":"a9a6c7e5dd0ff8e8b23570e5bda5a98ad20ee6300f2697c24dbd2996f83d7b8d"} Apr 24 21:18:03.131210 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.131174 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-78x5v"] Apr 24 21:18:03.136549 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.136526 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.139976 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.139788 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hxfzx\"" Apr 24 21:18:03.139976 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.139788 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:18:03.139976 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.139831 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:18:03.156485 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.156446 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-78x5v"] Apr 24 21:18:03.201913 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.201854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-data-volume\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.202090 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.201951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbtt\" (UniqueName: \"kubernetes.io/projected/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-kube-api-access-gcbtt\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.202090 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.201991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.202090 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.202030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-crio-socket\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.202256 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.202113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.302801 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.302726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-data-volume\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.302801 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.302772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbtt\" (UniqueName: \"kubernetes.io/projected/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-kube-api-access-gcbtt\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " 
pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.303047 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.302803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.303047 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.302836 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-crio-socket\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.303047 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.302927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.303047 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.302996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-crio-socket\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.303205 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.303143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-data-volume\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.303458 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.303439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.305450 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.305429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.317442 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.317397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbtt\" (UniqueName: \"kubernetes.io/projected/bd8ed4fc-6e48-474a-8cc2-9f257be8decd-kube-api-access-gcbtt\") pod \"insights-runtime-extractor-78x5v\" (UID: \"bd8ed4fc-6e48-474a-8cc2-9f257be8decd\") " pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.447664 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.447635 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-78x5v" Apr 24 21:18:03.576377 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.576346 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-78x5v"] Apr 24 21:18:03.579104 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:03.579082 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8ed4fc_6e48_474a_8cc2_9f257be8decd.slice/crio-1d70ba0668a24246123275993daa4f123a1733804e31d09533fca1a96b9bd33f WatchSource:0}: Error finding container 1d70ba0668a24246123275993daa4f123a1733804e31d09533fca1a96b9bd33f: Status 404 returned error can't find the container with id 1d70ba0668a24246123275993daa4f123a1733804e31d09533fca1a96b9bd33f Apr 24 21:18:03.994337 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.994298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" event={"ID":"772a4257-de5c-42f7-8b8c-0ee2404f99a6","Type":"ContainerStarted","Data":"b657d7306cf626f8463abd4d492be52b9d829f37cf948cadfe00e57359c0dec6"} Apr 24 21:18:03.995963 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.995928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78x5v" event={"ID":"bd8ed4fc-6e48-474a-8cc2-9f257be8decd","Type":"ContainerStarted","Data":"727000d6731cf09db325c8e19c28a8b1ded7a113add631d6c52c252bed8b7ec5"} Apr 24 21:18:03.996079 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:03.995963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78x5v" event={"ID":"bd8ed4fc-6e48-474a-8cc2-9f257be8decd","Type":"ContainerStarted","Data":"1d70ba0668a24246123275993daa4f123a1733804e31d09533fca1a96b9bd33f"} Apr 24 21:18:04.012826 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.012785 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-nl9ms" podStartSLOduration=32.543000289 podStartE2EDuration="34.012771418s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="2026-04-24 21:18:02.029212648 +0000 UTC m=+142.024963384" lastFinishedPulling="2026-04-24 21:18:03.498983759 +0000 UTC m=+143.494734513" observedRunningTime="2026-04-24 21:18:04.011149079 +0000 UTC m=+144.006899837" watchObservedRunningTime="2026-04-24 21:18:04.012771418 +0000 UTC m=+144.008522175" Apr 24 21:18:04.120144 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.120101 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr"] Apr 24 21:18:04.123505 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.123487 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" Apr 24 21:18:04.126124 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.126102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 21:18:04.126124 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.126120 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-xnbgv\"" Apr 24 21:18:04.131053 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.131030 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr"] Apr 24 21:18:04.211508 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.211487 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/5f38d17b-da8e-46bd-ba98-9a498446b4d2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mdndr\" (UID: \"5f38d17b-da8e-46bd-ba98-9a498446b4d2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" Apr 24 21:18:04.312764 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.312690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5f38d17b-da8e-46bd-ba98-9a498446b4d2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mdndr\" (UID: \"5f38d17b-da8e-46bd-ba98-9a498446b4d2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" Apr 24 21:18:04.315042 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.315019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5f38d17b-da8e-46bd-ba98-9a498446b4d2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mdndr\" (UID: \"5f38d17b-da8e-46bd-ba98-9a498446b4d2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" Apr 24 21:18:04.433343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.433304 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" Apr 24 21:18:04.551055 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:04.550759 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr"] Apr 24 21:18:04.553280 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:04.553240 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f38d17b_da8e_46bd_ba98_9a498446b4d2.slice/crio-924d5de38e55054a5e35703841d7c8232124757c32fcf4993f9aba9d8e2429f9 WatchSource:0}: Error finding container 924d5de38e55054a5e35703841d7c8232124757c32fcf4993f9aba9d8e2429f9: Status 404 returned error can't find the container with id 924d5de38e55054a5e35703841d7c8232124757c32fcf4993f9aba9d8e2429f9 Apr 24 21:18:05.000393 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:05.000342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" event={"ID":"5f38d17b-da8e-46bd-ba98-9a498446b4d2","Type":"ContainerStarted","Data":"924d5de38e55054a5e35703841d7c8232124757c32fcf4993f9aba9d8e2429f9"} Apr 24 21:18:05.002635 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:05.002596 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78x5v" event={"ID":"bd8ed4fc-6e48-474a-8cc2-9f257be8decd","Type":"ContainerStarted","Data":"2a3af0043f50c58057c1258ba8160a49093e3518c7c123cff6eb8a413e588604"} Apr 24 21:18:06.006594 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.006512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-78x5v" event={"ID":"bd8ed4fc-6e48-474a-8cc2-9f257be8decd","Type":"ContainerStarted","Data":"0db9c0d32b586d0efc49d714cb74720ce422107085f81f6d77dad84e6a006659"} Apr 24 21:18:06.007722 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:18:06.007703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" event={"ID":"5f38d17b-da8e-46bd-ba98-9a498446b4d2","Type":"ContainerStarted","Data":"1f40c59587731d182f1af289725d8d4f2671232f4239b6bac8de31b5b56ea30c"} Apr 24 21:18:06.007926 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.007901 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" Apr 24 21:18:06.012269 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.012246 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" Apr 24 21:18:06.025226 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.025182 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-78x5v" podStartSLOduration=0.92102184 podStartE2EDuration="3.02517121s" podCreationTimestamp="2026-04-24 21:18:03 +0000 UTC" firstStartedPulling="2026-04-24 21:18:03.630936048 +0000 UTC m=+143.626686784" lastFinishedPulling="2026-04-24 21:18:05.735085416 +0000 UTC m=+145.730836154" observedRunningTime="2026-04-24 21:18:06.023745069 +0000 UTC m=+146.019495826" watchObservedRunningTime="2026-04-24 21:18:06.02517121 +0000 UTC m=+146.020921968" Apr 24 21:18:06.041023 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.040983 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdndr" podStartSLOduration=0.859481397 podStartE2EDuration="2.040972601s" podCreationTimestamp="2026-04-24 21:18:04 +0000 UTC" firstStartedPulling="2026-04-24 21:18:04.555128382 +0000 UTC m=+144.550879118" lastFinishedPulling="2026-04-24 21:18:05.736619583 +0000 UTC m=+145.732370322" 
observedRunningTime="2026-04-24 21:18:06.039566063 +0000 UTC m=+146.035316820" watchObservedRunningTime="2026-04-24 21:18:06.040972601 +0000 UTC m=+146.036723360" Apr 24 21:18:06.245638 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.245605 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hpwxd"] Apr 24 21:18:06.248990 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.248974 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.252206 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.252027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5tpmh\"" Apr 24 21:18:06.252206 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.252129 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hpwxd"] Apr 24 21:18:06.252950 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.252933 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:18:06.254329 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.254299 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:18:06.259827 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.259781 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:18:06.333501 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.333473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d8232dc-a440-4a5c-8138-8119b4f19fd7-metrics-client-ca\") pod 
\"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.333654 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.333530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8232dc-a440-4a5c-8138-8119b4f19fd7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.333654 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.333590 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwspb\" (UniqueName: \"kubernetes.io/projected/7d8232dc-a440-4a5c-8138-8119b4f19fd7-kube-api-access-cwspb\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.333654 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.333648 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d8232dc-a440-4a5c-8138-8119b4f19fd7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.434615 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.434586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d8232dc-a440-4a5c-8138-8119b4f19fd7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.434759 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.434647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8232dc-a440-4a5c-8138-8119b4f19fd7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.434759 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.434689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwspb\" (UniqueName: \"kubernetes.io/projected/7d8232dc-a440-4a5c-8138-8119b4f19fd7-kube-api-access-cwspb\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.434759 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.434720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d8232dc-a440-4a5c-8138-8119b4f19fd7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.435372 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.435351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d8232dc-a440-4a5c-8138-8119b4f19fd7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.437258 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.437233 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d8232dc-a440-4a5c-8138-8119b4f19fd7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.437336 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.437314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d8232dc-a440-4a5c-8138-8119b4f19fd7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.447818 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.447787 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwspb\" (UniqueName: \"kubernetes.io/projected/7d8232dc-a440-4a5c-8138-8119b4f19fd7-kube-api-access-cwspb\") pod \"prometheus-operator-5676c8c784-hpwxd\" (UID: \"7d8232dc-a440-4a5c-8138-8119b4f19fd7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.558707 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.558633 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" Apr 24 21:18:06.676841 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:06.676812 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hpwxd"] Apr 24 21:18:06.679645 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:06.679615 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d8232dc_a440_4a5c_8138_8119b4f19fd7.slice/crio-70d2bf1fa3869fe9c893716cf5ea44c4e4aaadf0b87b229cbc11e211d512fb30 WatchSource:0}: Error finding container 70d2bf1fa3869fe9c893716cf5ea44c4e4aaadf0b87b229cbc11e211d512fb30: Status 404 returned error can't find the container with id 70d2bf1fa3869fe9c893716cf5ea44c4e4aaadf0b87b229cbc11e211d512fb30 Apr 24 21:18:07.011283 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:07.011245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" event={"ID":"7d8232dc-a440-4a5c-8138-8119b4f19fd7","Type":"ContainerStarted","Data":"70d2bf1fa3869fe9c893716cf5ea44c4e4aaadf0b87b229cbc11e211d512fb30"} Apr 24 21:18:09.019207 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:09.019165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" event={"ID":"7d8232dc-a440-4a5c-8138-8119b4f19fd7","Type":"ContainerStarted","Data":"3398d1ccc2c930e9bbadfababc06812971544b3f0ed3e1a280aa0111907d57ba"} Apr 24 21:18:09.019207 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:09.019205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" event={"ID":"7d8232dc-a440-4a5c-8138-8119b4f19fd7","Type":"ContainerStarted","Data":"3515afcb61a09488c11efdd6a8052b8e2e3493188f3489298d26ecf66b5c89f4"} Apr 24 21:18:09.038949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:09.038890 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-hpwxd" podStartSLOduration=1.472103645 podStartE2EDuration="3.038862871s" podCreationTimestamp="2026-04-24 21:18:06 +0000 UTC" firstStartedPulling="2026-04-24 21:18:06.681387228 +0000 UTC m=+146.677137968" lastFinishedPulling="2026-04-24 21:18:08.248146458 +0000 UTC m=+148.243897194" observedRunningTime="2026-04-24 21:18:09.038167104 +0000 UTC m=+149.033917862" watchObservedRunningTime="2026-04-24 21:18:09.038862871 +0000 UTC m=+149.034613630" Apr 24 21:18:10.582264 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.582234 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7khp8"] Apr 24 21:18:10.585546 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.585525 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.587952 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.587929 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f2sfg\"" Apr 24 21:18:10.588151 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.588127 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:18:10.588327 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.588307 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:18:10.588441 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.588248 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:18:10.671173 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671144 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671173 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-tls\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfcb8364-b2c4-4264-99a6-c796a5c6678a-metrics-client-ca\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-sys\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-textfile\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " 
pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-wtmp\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671387 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-accelerators-collector-config\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671658 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671446 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-root\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.671658 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.671506 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkfm\" (UniqueName: \"kubernetes.io/projected/bfcb8364-b2c4-4264-99a6-c796a5c6678a-kube-api-access-qgkfm\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.772812 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.772781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.772812 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.772817 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-tls\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773066 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.772834 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfcb8364-b2c4-4264-99a6-c796a5c6678a-metrics-client-ca\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773066 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:10.772957 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:18:10.773066 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:10.773019 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-tls podName:bfcb8364-b2c4-4264-99a6-c796a5c6678a nodeName:}" failed. No retries permitted until 2026-04-24 21:18:11.27300068 +0000 UTC m=+151.268751425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-tls") pod "node-exporter-7khp8" (UID: "bfcb8364-b2c4-4264-99a6-c796a5c6678a") : secret "node-exporter-tls" not found Apr 24 21:18:10.773318 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-sys\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-textfile\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-wtmp\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfcb8364-b2c4-4264-99a6-c796a5c6678a-metrics-client-ca\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773422 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-accelerators-collector-config\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773659 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-root\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773659 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkfm\" (UniqueName: \"kubernetes.io/projected/bfcb8364-b2c4-4264-99a6-c796a5c6678a-kube-api-access-qgkfm\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773659 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773628 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-textfile\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773817 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-sys\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773817 ip-10-0-134-147 kubenswrapper[2573]: I0424 
21:18:10.773764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-root\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.773943 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.773848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-wtmp\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.774243 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.774216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-accelerators-collector-config\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.775664 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.775640 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:10.783745 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:10.783699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkfm\" (UniqueName: \"kubernetes.io/projected/bfcb8364-b2c4-4264-99a6-c796a5c6678a-kube-api-access-qgkfm\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" 
Apr 24 21:18:11.276882 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:11.276835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-tls\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:11.279214 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:11.279192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bfcb8364-b2c4-4264-99a6-c796a5c6678a-node-exporter-tls\") pod \"node-exporter-7khp8\" (UID: \"bfcb8364-b2c4-4264-99a6-c796a5c6678a\") " pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:11.495766 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:11.495727 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7khp8" Apr 24 21:18:11.504218 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:11.504188 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfcb8364_b2c4_4264_99a6_c796a5c6678a.slice/crio-073d6cfc0e7807ee2270f0c3df2eb3687fa3542f29edd6627d53488c41c4e16a WatchSource:0}: Error finding container 073d6cfc0e7807ee2270f0c3df2eb3687fa3542f29edd6627d53488c41c4e16a: Status 404 returned error can't find the container with id 073d6cfc0e7807ee2270f0c3df2eb3687fa3542f29edd6627d53488c41c4e16a Apr 24 21:18:12.028207 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:12.028174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7khp8" event={"ID":"bfcb8364-b2c4-4264-99a6-c796a5c6678a","Type":"ContainerStarted","Data":"073d6cfc0e7807ee2270f0c3df2eb3687fa3542f29edd6627d53488c41c4e16a"} Apr 24 21:18:12.567403 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:12.567372 2573 scope.go:117] 
"RemoveContainer" containerID="4d0cf134c927b0e695121051929c9e682bbd6f03bed1624497881742ee187a74" Apr 24 21:18:12.567610 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:12.567590 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dln5m_openshift-console-operator(c1a58610-8ce3-4b65-8ceb-500127ff5a26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" podUID="c1a58610-8ce3-4b65-8ceb-500127ff5a26" Apr 24 21:18:13.031841 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.031803 2573 generic.go:358] "Generic (PLEG): container finished" podID="bfcb8364-b2c4-4264-99a6-c796a5c6678a" containerID="95c485347f483a37a719f88bc3089fbe1d7fc7c3e081631af8af3c68383f46b1" exitCode=0 Apr 24 21:18:13.032284 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.031860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7khp8" event={"ID":"bfcb8364-b2c4-4264-99a6-c796a5c6678a","Type":"ContainerDied","Data":"95c485347f483a37a719f88bc3089fbe1d7fc7c3e081631af8af3c68383f46b1"} Apr 24 21:18:13.633010 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.632970 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8"] Apr 24 21:18:13.636862 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.636840 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.639591 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.639567 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 21:18:13.639705 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.639598 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 21:18:13.639772 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.639743 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 21:18:13.639888 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.639856 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4jeccap07e13d\"" Apr 24 21:18:13.640001 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.639983 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 21:18:13.640101 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.640083 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 21:18:13.640227 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.640210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-czcf8\"" Apr 24 21:18:13.648446 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.648426 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8"] Apr 24 21:18:13.699910 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.699886 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522s5\" (UniqueName: \"kubernetes.io/projected/ef5047f4-c231-4399-9dd1-f54603124de8-kube-api-access-522s5\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.700038 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.699926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-tls\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.700038 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.699979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.700038 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.700009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.700209 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.700136 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.700209 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.700166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-grpc-tls\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.700209 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.700199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef5047f4-c231-4399-9dd1-f54603124de8-metrics-client-ca\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.700349 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.700261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801170 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801170 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-grpc-tls\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801380 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef5047f4-c231-4399-9dd1-f54603124de8-metrics-client-ca\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801435 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801505 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-522s5\" (UniqueName: \"kubernetes.io/projected/ef5047f4-c231-4399-9dd1-f54603124de8-kube-api-access-522s5\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " 
pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801562 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-tls\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801614 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.801614 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.801587 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.802231 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.802207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef5047f4-c231-4399-9dd1-f54603124de8-metrics-client-ca\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.804311 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.804275 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.804490 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.804438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.804490 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.804474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-grpc-tls\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.804602 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.804522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.804888 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.804855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-tls\") pod 
\"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.805102 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.805081 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef5047f4-c231-4399-9dd1-f54603124de8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.809852 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.809832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-522s5\" (UniqueName: \"kubernetes.io/projected/ef5047f4-c231-4399-9dd1-f54603124de8-kube-api-access-522s5\") pod \"thanos-querier-5b89f6dc86-bwsr8\" (UID: \"ef5047f4-c231-4399-9dd1-f54603124de8\") " pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:13.947034 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:13.946953 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:14.038029 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.037983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7khp8" event={"ID":"bfcb8364-b2c4-4264-99a6-c796a5c6678a","Type":"ContainerStarted","Data":"4e85bd17f398f18647c2a90331d1e99012743dd678e48d3cc4718a8600b257d4"} Apr 24 21:18:14.038352 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.038059 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7khp8" event={"ID":"bfcb8364-b2c4-4264-99a6-c796a5c6678a","Type":"ContainerStarted","Data":"f771e5c84795bba4a25efd99848b9eca702bc6d0c50191b9a912741c9831620d"} Apr 24 21:18:14.059488 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.059438 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7khp8" podStartSLOduration=3.3685519250000002 podStartE2EDuration="4.059424934s" podCreationTimestamp="2026-04-24 21:18:10 +0000 UTC" firstStartedPulling="2026-04-24 21:18:11.505859187 +0000 UTC m=+151.501609933" lastFinishedPulling="2026-04-24 21:18:12.196732202 +0000 UTC m=+152.192482942" observedRunningTime="2026-04-24 21:18:14.058584863 +0000 UTC m=+154.054335645" watchObservedRunningTime="2026-04-24 21:18:14.059424934 +0000 UTC m=+154.055175691" Apr 24 21:18:14.069214 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.069187 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8"] Apr 24 21:18:14.074053 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:14.074026 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef5047f4_c231_4399_9dd1_f54603124de8.slice/crio-2853febd4ad6ed2928c055378549f505e90a237aa8d59a9c33f1028501668bb9 WatchSource:0}: Error finding container 
2853febd4ad6ed2928c055378549f505e90a237aa8d59a9c33f1028501668bb9: Status 404 returned error can't find the container with id 2853febd4ad6ed2928c055378549f505e90a237aa8d59a9c33f1028501668bb9 Apr 24 21:18:14.979018 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.978971 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5978444448-xwdl6"] Apr 24 21:18:14.982415 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.982394 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:14.986002 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.985979 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 21:18:14.986949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.986928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-qtbz7\"" Apr 24 21:18:14.986949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.986942 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 21:18:14.987419 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.987398 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:18:14.987419 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.987414 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fvmk7eef58399\"" Apr 24 21:18:14.987570 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.987410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 21:18:14.998720 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:14.998680 2573 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5978444448-xwdl6"] Apr 24 21:18:15.042408 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.042374 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" event={"ID":"ef5047f4-c231-4399-9dd1-f54603124de8","Type":"ContainerStarted","Data":"2853febd4ad6ed2928c055378549f505e90a237aa8d59a9c33f1028501668bb9"} Apr 24 21:18:15.113180 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.113151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e4dc9d95-b126-48df-97ba-118157d5b0a4-metrics-server-audit-profiles\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.113328 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.113186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-client-ca-bundle\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.113328 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.113288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4wp6\" (UniqueName: \"kubernetes.io/projected/e4dc9d95-b126-48df-97ba-118157d5b0a4-kube-api-access-m4wp6\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.113449 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.113340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e4dc9d95-b126-48df-97ba-118157d5b0a4-audit-log\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.113611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.113584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4dc9d95-b126-48df-97ba-118157d5b0a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.113706 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.113641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-secret-metrics-server-tls\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.113751 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.113695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-secret-metrics-server-client-certs\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.215161 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.215127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/e4dc9d95-b126-48df-97ba-118157d5b0a4-metrics-server-audit-profiles\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.215161 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.215162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-client-ca-bundle\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.215402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.215195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4wp6\" (UniqueName: \"kubernetes.io/projected/e4dc9d95-b126-48df-97ba-118157d5b0a4-kube-api-access-m4wp6\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.215402 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.215255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e4dc9d95-b126-48df-97ba-118157d5b0a4-audit-log\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.215587 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.215560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4dc9d95-b126-48df-97ba-118157d5b0a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 
21:18:15.215730 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.215648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e4dc9d95-b126-48df-97ba-118157d5b0a4-audit-log\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.216280 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.215629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-secret-metrics-server-tls\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.216407 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.216382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-secret-metrics-server-client-certs\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.216616 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.216589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e4dc9d95-b126-48df-97ba-118157d5b0a4-metrics-server-audit-profiles\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.221604 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.216980 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e4dc9d95-b126-48df-97ba-118157d5b0a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.221604 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.219017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-client-ca-bundle\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.221604 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.220150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-secret-metrics-server-client-certs\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.222752 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.222726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e4dc9d95-b126-48df-97ba-118157d5b0a4-secret-metrics-server-tls\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.224745 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.224723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4wp6\" (UniqueName: \"kubernetes.io/projected/e4dc9d95-b126-48df-97ba-118157d5b0a4-kube-api-access-m4wp6\") pod \"metrics-server-5978444448-xwdl6\" (UID: \"e4dc9d95-b126-48df-97ba-118157d5b0a4\") " pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 
21:18:15.293739 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.293647 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:15.761896 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:15.761846 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5978444448-xwdl6"] Apr 24 21:18:15.764906 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:15.764864 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4dc9d95_b126_48df_97ba_118157d5b0a4.slice/crio-24bdc4fa6f8e81caeb0d762a3309cd43cd2f1ce990ad5876308d54ab256f2f53 WatchSource:0}: Error finding container 24bdc4fa6f8e81caeb0d762a3309cd43cd2f1ce990ad5876308d54ab256f2f53: Status 404 returned error can't find the container with id 24bdc4fa6f8e81caeb0d762a3309cd43cd2f1ce990ad5876308d54ab256f2f53 Apr 24 21:18:15.860037 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:15.860006 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" podUID="f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" Apr 24 21:18:15.874274 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:15.874238 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4fr27" podUID="022d6343-7dfc-470e-8e3c-3380ea630933" Apr 24 21:18:15.881373 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:15.881346 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gp5s2" 
podUID="7c51aa96-bca7-47fa-bba2-badf0e22ee4d" Apr 24 21:18:16.046387 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.046341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" event={"ID":"ef5047f4-c231-4399-9dd1-f54603124de8","Type":"ContainerStarted","Data":"ebecd272482e772a17588770c2791f579e4d55ac16b93397334eb31ed72df11b"} Apr 24 21:18:16.046387 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.046378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" event={"ID":"ef5047f4-c231-4399-9dd1-f54603124de8","Type":"ContainerStarted","Data":"8f8e5364b246d9465a2b48c204bcd1dc5bee4cee44713091e4d6758273533383"} Apr 24 21:18:16.046387 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.046389 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" event={"ID":"ef5047f4-c231-4399-9dd1-f54603124de8","Type":"ContainerStarted","Data":"322c37e8bed27240d16bee42a6f26f43f0b6c346926455e1cb45ad391b81e7fb"} Apr 24 21:18:16.047347 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.047321 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:18:16.047453 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.047358 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4fr27" Apr 24 21:18:16.047453 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.047356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5978444448-xwdl6" event={"ID":"e4dc9d95-b126-48df-97ba-118157d5b0a4","Type":"ContainerStarted","Data":"24bdc4fa6f8e81caeb0d762a3309cd43cd2f1ce990ad5876308d54ab256f2f53"} Apr 24 21:18:16.047453 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.047364 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:18:16.909987 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.909894 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:18:16.915009 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.914980 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:16.917814 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.917639 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:18:16.917953 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.917837 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:18:16.919609 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.918555 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:18:16.919609 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.918831 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:18:16.919609 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.919137 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:18:16.919609 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.919472 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:18:16.920061 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.919845 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-m26qw\"" Apr 24 21:18:16.920061 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.919985 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:18:16.920169 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.920113 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8tcl0k3ke0b36\"" Apr 24 21:18:16.920258 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.920222 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:18:16.920258 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.920253 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:18:16.920567 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.920486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:18:16.922041 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.922020 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:18:16.928773 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.928729 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:18:16.929319 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:16.929296 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvvpf\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-kube-api-access-kvvpf\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034926 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034973 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config-out\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.034999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.035043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.035073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-web-config\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.035097 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.035343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.035138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.036294 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.035161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.036294 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.035199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.054060 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.054024 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" event={"ID":"ef5047f4-c231-4399-9dd1-f54603124de8","Type":"ContainerStarted","Data":"02f31d5c0220f3c119868b528dafcf79b59c810bc14b8a9551ff3df415606d36"} Apr 24 
21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvvpf\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-kube-api-access-kvvpf\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config\") pod \"prometheus-k8s-0\" (UID: 
\"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config-out\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-web-config\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150616 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.151842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.152747 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.150791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.152747 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.151485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.159315 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:18:17.158919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.159315 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.159189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.161503 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.160742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.165481 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.165453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.168456 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.167629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.168456 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.168234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.172036 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.171971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-web-config\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.172036 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.171979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.173070 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.173024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config-out\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.174209 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.174182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.174678 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.174637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.175611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.175585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.178961 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.176827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.178961 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.177646 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.178961 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.178035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.179535 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.179511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.191409 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.191385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvvpf\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-kube-api-access-kvvpf\") pod \"prometheus-k8s-0\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.231024 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.230084 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:17.395240 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:17.395206 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:18:17.400963 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:17.400929 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc611eee9_1cbe_4e0f_abe6_0eccd82b9fd8.slice/crio-cf8a73b9213ada27b7836d6cce078514a61964e7226f294ff4ef1f928b75d480 WatchSource:0}: Error finding container cf8a73b9213ada27b7836d6cce078514a61964e7226f294ff4ef1f928b75d480: Status 404 returned error can't find the container with id cf8a73b9213ada27b7836d6cce078514a61964e7226f294ff4ef1f928b75d480 Apr 24 21:18:18.057983 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:18.057922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerStarted","Data":"cf8a73b9213ada27b7836d6cce078514a61964e7226f294ff4ef1f928b75d480"} Apr 24 21:18:18.059441 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:18.059408 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5978444448-xwdl6" event={"ID":"e4dc9d95-b126-48df-97ba-118157d5b0a4","Type":"ContainerStarted","Data":"5a711f82ae19bdb0c013b28583495065f6d32a534b4092c65f751188f3da81c9"} Apr 24 21:18:18.062349 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:18.062321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" event={"ID":"ef5047f4-c231-4399-9dd1-f54603124de8","Type":"ContainerStarted","Data":"02f78384a633ce7022ba28e1063edeb55f535dc85ba284aa2ea9ab472265a5c6"} Apr 24 21:18:18.062455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:18.062354 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" event={"ID":"ef5047f4-c231-4399-9dd1-f54603124de8","Type":"ContainerStarted","Data":"b61595ea54898380efd737a156a23b0f4abf0d156ea4bcb587eb0fbb0b566b29"} Apr 24 21:18:18.062534 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:18.062523 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:18.076610 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:18.076563 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5978444448-xwdl6" podStartSLOduration=2.672339197 podStartE2EDuration="4.076549705s" podCreationTimestamp="2026-04-24 21:18:14 +0000 UTC" firstStartedPulling="2026-04-24 21:18:15.767360789 +0000 UTC m=+155.763111529" lastFinishedPulling="2026-04-24 21:18:17.171571294 +0000 UTC m=+157.167322037" observedRunningTime="2026-04-24 21:18:18.07541467 +0000 UTC m=+158.071165422" 
watchObservedRunningTime="2026-04-24 21:18:18.076549705 +0000 UTC m=+158.072300440" Apr 24 21:18:18.099626 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:18.099574 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" podStartSLOduration=2.601767874 podStartE2EDuration="5.099561687s" podCreationTimestamp="2026-04-24 21:18:13 +0000 UTC" firstStartedPulling="2026-04-24 21:18:14.075721915 +0000 UTC m=+154.071472652" lastFinishedPulling="2026-04-24 21:18:16.573515725 +0000 UTC m=+156.569266465" observedRunningTime="2026-04-24 21:18:18.099171547 +0000 UTC m=+158.094922309" watchObservedRunningTime="2026-04-24 21:18:18.099561687 +0000 UTC m=+158.095312446" Apr 24 21:18:19.066611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:19.066578 2573 generic.go:358] "Generic (PLEG): container finished" podID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" exitCode=0 Apr 24 21:18:19.067045 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:19.066662 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32"} Apr 24 21:18:20.784187 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.784094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:18:20.784187 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.784162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:18:20.784678 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.784227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:18:20.787098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.787036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/022d6343-7dfc-470e-8e3c-3380ea630933-metrics-tls\") pod \"dns-default-4fr27\" (UID: \"022d6343-7dfc-470e-8e3c-3380ea630933\") " pod="openshift-dns/dns-default-4fr27" Apr 24 21:18:20.787213 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.787118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c51aa96-bca7-47fa-bba2-badf0e22ee4d-cert\") pod \"ingress-canary-gp5s2\" (UID: \"7c51aa96-bca7-47fa-bba2-badf0e22ee4d\") " pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:18:20.787292 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.787229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"image-registry-7675dc97f4-6p6nx\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") " pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:18:20.852228 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.852200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qvvhg\"" Apr 24 21:18:20.852377 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:18:20.852246 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-txjjd\"" Apr 24 21:18:20.852377 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.852246 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bc8qn\"" Apr 24 21:18:20.859383 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.859362 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gp5s2" Apr 24 21:18:20.859492 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.859466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:18:20.859561 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:20.859526 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4fr27" Apr 24 21:18:21.593184 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:21.593133 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gp5s2"] Apr 24 21:18:21.600263 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:21.600238 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c51aa96_bca7_47fa_bba2_badf0e22ee4d.slice/crio-6bf9df914963a2f57755ff0b17e3563895049cdc9ea185d2c9bf67679774bfef WatchSource:0}: Error finding container 6bf9df914963a2f57755ff0b17e3563895049cdc9ea185d2c9bf67679774bfef: Status 404 returned error can't find the container with id 6bf9df914963a2f57755ff0b17e3563895049cdc9ea185d2c9bf67679774bfef Apr 24 21:18:21.619143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:21.618575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7675dc97f4-6p6nx"] Apr 24 21:18:21.624610 ip-10-0-134-147 
kubenswrapper[2573]: W0424 21:18:21.624571 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c73ec4_3e98_4b0a_a1ae_236998f28fb1.slice/crio-0e4c7e773981073f32736d0fef99111865560845b7c09ad9430232d7385ef932 WatchSource:0}: Error finding container 0e4c7e773981073f32736d0fef99111865560845b7c09ad9430232d7385ef932: Status 404 returned error can't find the container with id 0e4c7e773981073f32736d0fef99111865560845b7c09ad9430232d7385ef932 Apr 24 21:18:21.635606 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:21.634236 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4fr27"] Apr 24 21:18:21.644110 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:18:21.644077 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022d6343_7dfc_470e_8e3c_3380ea630933.slice/crio-f936534b47817e7dfcde0a921bd4427a40972fa070c01452233ff5fcbdaaee1c WatchSource:0}: Error finding container f936534b47817e7dfcde0a921bd4427a40972fa070c01452233ff5fcbdaaee1c: Status 404 returned error can't find the container with id f936534b47817e7dfcde0a921bd4427a40972fa070c01452233ff5fcbdaaee1c Apr 24 21:18:22.078694 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.078655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" event={"ID":"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1","Type":"ContainerStarted","Data":"bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8"} Apr 24 21:18:22.078694 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.078699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" event={"ID":"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1","Type":"ContainerStarted","Data":"0e4c7e773981073f32736d0fef99111865560845b7c09ad9430232d7385ef932"} Apr 24 21:18:22.079266 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:18:22.078758 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" Apr 24 21:18:22.080041 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.080011 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4fr27" event={"ID":"022d6343-7dfc-470e-8e3c-3380ea630933","Type":"ContainerStarted","Data":"f936534b47817e7dfcde0a921bd4427a40972fa070c01452233ff5fcbdaaee1c"} Apr 24 21:18:22.081157 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.081131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gp5s2" event={"ID":"7c51aa96-bca7-47fa-bba2-badf0e22ee4d","Type":"ContainerStarted","Data":"6bf9df914963a2f57755ff0b17e3563895049cdc9ea185d2c9bf67679774bfef"} Apr 24 21:18:22.084527 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.084503 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerStarted","Data":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} Apr 24 21:18:22.084642 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.084530 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerStarted","Data":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} Apr 24 21:18:22.084642 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.084547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerStarted","Data":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} Apr 24 21:18:22.084642 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.084562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerStarted","Data":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} Apr 24 21:18:22.084642 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.084574 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerStarted","Data":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} Apr 24 21:18:22.084642 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.084586 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerStarted","Data":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} Apr 24 21:18:22.100497 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.100455 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" podStartSLOduration=161.10044029 podStartE2EDuration="2m41.10044029s" podCreationTimestamp="2026-04-24 21:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:18:22.098457608 +0000 UTC m=+162.094208368" watchObservedRunningTime="2026-04-24 21:18:22.10044029 +0000 UTC m=+162.096191049" Apr 24 21:18:22.124401 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.124311 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.030244718 podStartE2EDuration="6.12428588s" podCreationTimestamp="2026-04-24 21:18:16 +0000 UTC" firstStartedPulling="2026-04-24 21:18:17.403208923 +0000 UTC m=+157.398959664" lastFinishedPulling="2026-04-24 21:18:21.497250089 +0000 UTC m=+161.493000826" observedRunningTime="2026-04-24 21:18:22.12333334 
+0000 UTC m=+162.119084122" watchObservedRunningTime="2026-04-24 21:18:22.12428588 +0000 UTC m=+162.120036640" Apr 24 21:18:22.231381 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:22.231287 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:18:24.072501 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.072468 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5b89f6dc86-bwsr8" Apr 24 21:18:24.092461 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.092428 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4fr27" event={"ID":"022d6343-7dfc-470e-8e3c-3380ea630933","Type":"ContainerStarted","Data":"e40d861a3bd98c614dfb17fefefe677b632aeb6de45d99e1d51ce66c4093506a"} Apr 24 21:18:24.092624 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.092469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4fr27" event={"ID":"022d6343-7dfc-470e-8e3c-3380ea630933","Type":"ContainerStarted","Data":"a69ce5750672d7196c3ac8649276fe15007201457386aca1d9a473e26e29c102"} Apr 24 21:18:24.092624 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.092532 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4fr27" Apr 24 21:18:24.093787 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.093764 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gp5s2" event={"ID":"7c51aa96-bca7-47fa-bba2-badf0e22ee4d","Type":"ContainerStarted","Data":"4d5dffafa5085492378e9e0e7d1d9f87808960bf0d5911de687d8f6b6d1de040"} Apr 24 21:18:24.115477 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.115431 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4fr27" podStartSLOduration=130.237831182 podStartE2EDuration="2m12.115414417s" 
podCreationTimestamp="2026-04-24 21:16:12 +0000 UTC" firstStartedPulling="2026-04-24 21:18:21.646123603 +0000 UTC m=+161.641874345" lastFinishedPulling="2026-04-24 21:18:23.523706843 +0000 UTC m=+163.519457580" observedRunningTime="2026-04-24 21:18:24.114160784 +0000 UTC m=+164.109911542" watchObservedRunningTime="2026-04-24 21:18:24.115414417 +0000 UTC m=+164.111165219" Apr 24 21:18:24.129098 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.129059 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gp5s2" podStartSLOduration=130.206857785 podStartE2EDuration="2m12.129044635s" podCreationTimestamp="2026-04-24 21:16:12 +0000 UTC" firstStartedPulling="2026-04-24 21:18:21.60634324 +0000 UTC m=+161.602093991" lastFinishedPulling="2026-04-24 21:18:23.528530086 +0000 UTC m=+163.524280841" observedRunningTime="2026-04-24 21:18:24.128141838 +0000 UTC m=+164.123892619" watchObservedRunningTime="2026-04-24 21:18:24.129044635 +0000 UTC m=+164.124795394" Apr 24 21:18:24.567321 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:24.567295 2573 scope.go:117] "RemoveContainer" containerID="4d0cf134c927b0e695121051929c9e682bbd6f03bed1624497881742ee187a74" Apr 24 21:18:25.098693 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:25.098665 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:18:25.099180 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:25.098785 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" event={"ID":"c1a58610-8ce3-4b65-8ceb-500127ff5a26","Type":"ContainerStarted","Data":"2d59fa1645665d1d069fad48fdf262ece6721aa1fb3311407d60b56c9b312314"} Apr 24 21:18:25.099332 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:25.099316 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:18:25.121621 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:25.121560 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" podStartSLOduration=52.586192779 podStartE2EDuration="55.121543564s" podCreationTimestamp="2026-04-24 21:17:30 +0000 UTC" firstStartedPulling="2026-04-24 21:17:30.878771017 +0000 UTC m=+110.874521753" lastFinishedPulling="2026-04-24 21:17:33.414121787 +0000 UTC m=+113.409872538" observedRunningTime="2026-04-24 21:18:25.118510846 +0000 UTC m=+165.114261603" watchObservedRunningTime="2026-04-24 21:18:25.121543564 +0000 UTC m=+165.117294322" Apr 24 21:18:25.485439 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:25.485406 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7675dc97f4-6p6nx"] Apr 24 21:18:25.489441 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:25.489415 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-dln5m" Apr 24 21:18:34.101130 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:34.101097 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4fr27" Apr 24 21:18:35.294757 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:35.294716 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:35.294757 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:35.294756 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5978444448-xwdl6" Apr 24 21:18:45.491130 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:45.491098 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:18:49.175955 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:49.175916 2573 generic.go:358] "Generic (PLEG): container finished" podID="621c1634-3d26-4e3e-8a2e-f735fa5423f9" containerID="990d51c271f077ba75c202ef6760de4d5fb3e586a9a11257a796a550c5110ab4" exitCode=0
Apr 24 21:18:49.176276 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:49.175989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" event={"ID":"621c1634-3d26-4e3e-8a2e-f735fa5423f9","Type":"ContainerDied","Data":"990d51c271f077ba75c202ef6760de4d5fb3e586a9a11257a796a550c5110ab4"}
Apr 24 21:18:49.176343 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:49.176301 2573 scope.go:117] "RemoveContainer" containerID="990d51c271f077ba75c202ef6760de4d5fb3e586a9a11257a796a550c5110ab4"
Apr 24 21:18:50.180532 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.180497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gmz9s" event={"ID":"621c1634-3d26-4e3e-8a2e-f735fa5423f9","Type":"ContainerStarted","Data":"c2f20f189ae94ab5dbb93a36d837e3ebd95b97dea77366b95614f3298846b41e"}
Apr 24 21:18:50.505847 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.505798 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" podUID="f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" containerName="registry" containerID="cri-o://bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8" gracePeriod=30
Apr 24 21:18:50.750016 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.749987 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:18:50.846902 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.846802 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.846902 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.846831 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-image-registry-private-configuration\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.846902 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.846864 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8l8l\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-kube-api-access-m8l8l\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.846902 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.846902 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-ca-trust-extracted\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.847210 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.846944 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-bound-sa-token\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.847210 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.846991 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-installation-pull-secrets\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.847210 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.847015 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-trusted-ca\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.847210 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.847042 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-certificates\") pod \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\" (UID: \"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1\") "
Apr 24 21:18:50.847570 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.847508 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:50.847702 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.847588 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:50.849486 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.849434 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-kube-api-access-m8l8l" (OuterVolumeSpecName: "kube-api-access-m8l8l") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "kube-api-access-m8l8l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:18:50.849623 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.849597 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:18:50.849726 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.849621 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:18:50.849780 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.849739 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:50.849826 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.849801 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:50.855832 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.855807 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" (UID: "f5c73ec4-3e98-4b0a-a1ae-236998f28fb1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:18:50.948132 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948098 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-tls\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:50.948132 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948129 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-image-registry-private-configuration\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:50.948331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948144 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8l8l\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-kube-api-access-m8l8l\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:50.948331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948158 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-ca-trust-extracted\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:50.948331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948173 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-bound-sa-token\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:50.948331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948186 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-installation-pull-secrets\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:50.948331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948199 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-trusted-ca\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:50.948331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:50.948211 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1-registry-certificates\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:18:51.184455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.184376 2573 generic.go:358] "Generic (PLEG): container finished" podID="f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" containerID="bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8" exitCode=0
Apr 24 21:18:51.184455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.184428 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx"
Apr 24 21:18:51.184455 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.184441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" event={"ID":"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1","Type":"ContainerDied","Data":"bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8"}
Apr 24 21:18:51.184988 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.184473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7675dc97f4-6p6nx" event={"ID":"f5c73ec4-3e98-4b0a-a1ae-236998f28fb1","Type":"ContainerDied","Data":"0e4c7e773981073f32736d0fef99111865560845b7c09ad9430232d7385ef932"}
Apr 24 21:18:51.184988 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.184491 2573 scope.go:117] "RemoveContainer" containerID="bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8"
Apr 24 21:18:51.192720 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.192701 2573 scope.go:117] "RemoveContainer" containerID="bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8"
Apr 24 21:18:51.192993 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:18:51.192969 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8\": container with ID starting with bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8 not found: ID does not exist" containerID="bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8"
Apr 24 21:18:51.193043 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.193001 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8"} err="failed to get container status \"bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8\": rpc error: code = NotFound desc = could not find container \"bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8\": container with ID starting with bdde4cc03545ffa5f50b35da30524db110ded55fd35dbc89102ba465521225a8 not found: ID does not exist"
Apr 24 21:18:51.205421 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.205399 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7675dc97f4-6p6nx"]
Apr 24 21:18:51.209120 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:51.209101 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7675dc97f4-6p6nx"]
Apr 24 21:18:52.571177 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:52.571137 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" path="/var/lib/kubelet/pods/f5c73ec4-3e98-4b0a-a1ae-236998f28fb1/volumes"
Apr 24 21:18:55.198233 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:55.198203 2573 generic.go:358] "Generic (PLEG): container finished" podID="fb24c7be-a2bf-47fd-a8da-4bcaf272012a" containerID="feed3bc54a3ffa1c1f70ff627276e496fcedceaa233158ee727b3a1b055cb691" exitCode=0
Apr 24 21:18:55.198611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:55.198275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xsrdx" event={"ID":"fb24c7be-a2bf-47fd-a8da-4bcaf272012a","Type":"ContainerDied","Data":"feed3bc54a3ffa1c1f70ff627276e496fcedceaa233158ee727b3a1b055cb691"}
Apr 24 21:18:55.198611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:55.198589 2573 scope.go:117] "RemoveContainer" containerID="feed3bc54a3ffa1c1f70ff627276e496fcedceaa233158ee727b3a1b055cb691"
Apr 24 21:18:55.298702 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:55.298678 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5978444448-xwdl6"
Apr 24 21:18:55.302368 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:55.302348 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5978444448-xwdl6"
Apr 24 21:18:56.203711 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:56.203665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xsrdx" event={"ID":"fb24c7be-a2bf-47fd-a8da-4bcaf272012a","Type":"ContainerStarted","Data":"2d6b2859a60381f09865f0c0a07b5a9d39323b0a79cccb4a7f41087bbf19d30a"}
Apr 24 21:18:57.099718 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:57.099693 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-nl9ms_772a4257-de5c-42f7-8b8c-0ee2404f99a6/cluster-monitoring-operator/0.log"
Apr 24 21:18:57.898010 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:57.897982 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5978444448-xwdl6_e4dc9d95-b126-48df-97ba-118157d5b0a4/metrics-server/0.log"
Apr 24 21:18:58.296912 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:58.296883 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7khp8_bfcb8364-b2c4-4264-99a6-c796a5c6678a/init-textfile/0.log"
Apr 24 21:18:58.498503 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:58.498475 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7khp8_bfcb8364-b2c4-4264-99a6-c796a5c6678a/node-exporter/0.log"
Apr 24 21:18:58.698532 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:18:58.698460 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7khp8_bfcb8364-b2c4-4264-99a6-c796a5c6678a/kube-rbac-proxy/0.log"
Apr 24 21:19:00.697680 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:00.697649 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/init-config-reloader/0.log"
Apr 24 21:19:00.899577 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:00.899546 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/prometheus/0.log"
Apr 24 21:19:01.097697 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:01.097668 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/config-reloader/0.log"
Apr 24 21:19:01.297855 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:01.297827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/thanos-sidecar/0.log"
Apr 24 21:19:01.497679 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:01.497656 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/kube-rbac-proxy-web/0.log"
Apr 24 21:19:01.697646 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:01.697621 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/kube-rbac-proxy/0.log"
Apr 24 21:19:01.898145 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:01.898070 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/kube-rbac-proxy-thanos/0.log"
Apr 24 21:19:02.099267 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:02.099221 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hpwxd_7d8232dc-a440-4a5c-8138-8119b4f19fd7/prometheus-operator/0.log"
Apr 24 21:19:02.298170 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:02.298141 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hpwxd_7d8232dc-a440-4a5c-8138-8119b4f19fd7/kube-rbac-proxy/0.log"
Apr 24 21:19:02.498427 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:02.498396 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-mdndr_5f38d17b-da8e-46bd-ba98-9a498446b4d2/prometheus-operator-admission-webhook/0.log"
Apr 24 21:19:02.697535 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:02.697458 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/thanos-query/0.log"
Apr 24 21:19:02.898353 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:02.898326 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy-web/0.log"
Apr 24 21:19:03.098816 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:03.098788 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy/0.log"
Apr 24 21:19:03.298375 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:03.298345 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/prom-label-proxy/0.log"
Apr 24 21:19:03.497858 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:03.497825 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy-rules/0.log"
Apr 24 21:19:03.697794 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:03.697769 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy-metrics/0.log"
Apr 24 21:19:04.097765 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:04.097734 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log"
Apr 24 21:19:04.298912 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:04.298887 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/3.log"
Apr 24 21:19:17.230884 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:17.230820 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:17.250807 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:17.250777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:17.285438 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:17.285371 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:35.239741 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.239698 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:19:35.240331 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.240278 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="prometheus" containerID="cri-o://26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" gracePeriod=600
Apr 24 21:19:35.240556 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.240316 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy" containerID="cri-o://d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" gracePeriod=600
Apr 24 21:19:35.240671 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.240353 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-thanos" containerID="cri-o://a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" gracePeriod=600
Apr 24 21:19:35.240671 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.240371 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-web" containerID="cri-o://6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" gracePeriod=600
Apr 24 21:19:35.240671 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.240377 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="config-reloader" containerID="cri-o://6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" gracePeriod=600
Apr 24 21:19:35.240835 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.240375 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="thanos-sidecar" containerID="cri-o://eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" gracePeriod=600
Apr 24 21:19:35.496433 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.496369 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:35.514156 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514129 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvvpf\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-kube-api-access-kvvpf\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514172 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-tls-assets\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514203 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514219 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config-out\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514237 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-trusted-ca-bundle\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514260 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-rulefiles-0\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514284 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-db\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514608 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514303 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514676 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514643 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:35.514732 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514705 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-grpc-tls\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514785 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514753 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514839 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514811 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-tls\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514907 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514844 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-thanos-prometheus-http-client-file\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514960 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514906 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-kube-rbac-proxy\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.514960 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514947 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-kubelet-serving-ca-bundle\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.515061 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514972 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-metrics-client-ca\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.515061 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.514996 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-web-config\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.515061 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.515027 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-metrics-client-certs\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.515061 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.515054 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-serving-certs-ca-bundle\") pod \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\" (UID: \"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8\") "
Apr 24 21:19:35.516433 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.515345 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\""
Apr 24 21:19:35.516433 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.515516 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:35.516433 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.515643 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:35.516433 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.515780 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:19:35.516433 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.515938 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:35.516433 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.516221 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:19:35.517585 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.517483 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config" (OuterVolumeSpecName: "config") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:35.517585 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.517556 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:35.520614 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.519408 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:19:35.520614 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.519517 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-kube-api-access-kvvpf" (OuterVolumeSpecName: "kube-api-access-kvvpf") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "kube-api-access-kvvpf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:19:35.521224 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.521184 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:35.521371 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.521294 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:35.521664 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.521634 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:35.521899 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.521749 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:35.521899 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.521809 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:19:35.522022 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.521888 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config-out" (OuterVolumeSpecName: "config-out") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "config-out".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:19:35.522022 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.521954 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:35.533842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.533816 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-web-config" (OuterVolumeSpecName: "web-config") pod "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" (UID: "c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:35.615949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615912 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-metrics-client-certs\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.615949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615939 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.615949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615950 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvvpf\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-kube-api-access-kvvpf\") on node 
\"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615961 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-tls-assets\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615970 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615978 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-config-out\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615987 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.615996 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-prometheus-k8s-db\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616006 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616016 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-grpc-tls\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616024 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616033 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616042 2573 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616052 2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-secret-kube-rbac-proxy\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616060 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-147.ec2.internal\" 
DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616069 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-configmap-metrics-client-ca\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:35.616165 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:35.616077 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8-web-config\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:19:36.321285 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321255 2573 generic.go:358] "Generic (PLEG): container finished" podID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" exitCode=0 Apr 24 21:19:36.321285 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321280 2573 generic.go:358] "Generic (PLEG): container finished" podID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" exitCode=0 Apr 24 21:19:36.321285 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321288 2573 generic.go:358] "Generic (PLEG): container finished" podID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" exitCode=0 Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321293 2573 generic.go:358] "Generic (PLEG): container finished" podID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" exitCode=0 Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321300 2573 generic.go:358] "Generic (PLEG): container finished" podID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" 
containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" exitCode=0 Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321305 2573 generic.go:358] "Generic (PLEG): container finished" podID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" exitCode=0 Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321328 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321356 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321372 2573 scope.go:117] "RemoveContainer" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} Apr 24 21:19:36.321721 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.321569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8","Type":"ContainerDied","Data":"cf8a73b9213ada27b7836d6cce078514a61964e7226f294ff4ef1f928b75d480"} Apr 24 21:19:36.329966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.329947 
2573 scope.go:117] "RemoveContainer" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.336588 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.336571 2573 scope.go:117] "RemoveContainer" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.342640 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.342625 2573 scope.go:117] "RemoveContainer" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.345525 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.345504 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:19:36.349522 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.349497 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:19:36.349821 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.349807 2573 scope.go:117] "RemoveContainer" containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.356306 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.356286 2573 scope.go:117] "RemoveContainer" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.362920 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.362906 2573 scope.go:117] "RemoveContainer" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.369063 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.369046 2573 scope.go:117] "RemoveContainer" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.369292 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:19:36.369277 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": container with ID starting with 
a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4 not found: ID does not exist" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.369335 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.369298 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} err="failed to get container status \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": rpc error: code = NotFound desc = could not find container \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": container with ID starting with a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4 not found: ID does not exist" Apr 24 21:19:36.369335 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.369314 2573 scope.go:117] "RemoveContainer" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.369546 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:19:36.369528 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": container with ID starting with d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545 not found: ID does not exist" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.369597 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.369551 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} err="failed to get container status \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": rpc error: code = NotFound desc = could not find container \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": container with ID starting with 
d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545 not found: ID does not exist" Apr 24 21:19:36.369597 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.369567 2573 scope.go:117] "RemoveContainer" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.369769 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:19:36.369752 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": container with ID starting with 6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13 not found: ID does not exist" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.369806 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.369773 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} err="failed to get container status \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": rpc error: code = NotFound desc = could not find container \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": container with ID starting with 6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13 not found: ID does not exist" Apr 24 21:19:36.369806 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.369788 2573 scope.go:117] "RemoveContainer" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.370114 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:19:36.370096 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": container with ID starting with eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2 not found: ID does not exist" 
containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.370174 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370116 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} err="failed to get container status \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": rpc error: code = NotFound desc = could not find container \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": container with ID starting with eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2 not found: ID does not exist" Apr 24 21:19:36.370174 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370130 2573 scope.go:117] "RemoveContainer" containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.370314 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:19:36.370299 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": container with ID starting with 6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8 not found: ID does not exist" containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.370348 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370318 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} err="failed to get container status \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": rpc error: code = NotFound desc = could not find container \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": container with ID starting with 6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8 not found: ID does not exist" Apr 24 
21:19:36.370348 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370329 2573 scope.go:117] "RemoveContainer" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.370544 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:19:36.370528 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": container with ID starting with 26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d not found: ID does not exist" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.370575 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370557 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} err="failed to get container status \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": rpc error: code = NotFound desc = could not find container \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": container with ID starting with 26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d not found: ID does not exist" Apr 24 21:19:36.370575 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370573 2573 scope.go:117] "RemoveContainer" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.370807 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:19:36.370793 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": container with ID starting with dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32 not found: ID does not exist" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.370843 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370812 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32"} err="failed to get container status \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": rpc error: code = NotFound desc = could not find container \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": container with ID starting with dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32 not found: ID does not exist" Apr 24 21:19:36.370843 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.370825 2573 scope.go:117] "RemoveContainer" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.371044 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371027 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} err="failed to get container status \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": rpc error: code = NotFound desc = could not find container \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": container with ID starting with a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4 not found: ID does not exist" Apr 24 21:19:36.371084 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371046 2573 scope.go:117] "RemoveContainer" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.371256 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371240 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} err="failed to get container status \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": rpc error: code = NotFound desc = could not 
find container \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": container with ID starting with d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545 not found: ID does not exist" Apr 24 21:19:36.371300 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371256 2573 scope.go:117] "RemoveContainer" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.371458 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371442 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} err="failed to get container status \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": rpc error: code = NotFound desc = could not find container \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": container with ID starting with 6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13 not found: ID does not exist" Apr 24 21:19:36.371507 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371460 2573 scope.go:117] "RemoveContainer" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.371666 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371643 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} err="failed to get container status \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": rpc error: code = NotFound desc = could not find container \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": container with ID starting with eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2 not found: ID does not exist" Apr 24 21:19:36.371748 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371663 2573 scope.go:117] "RemoveContainer" 
containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.371889 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371849 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} err="failed to get container status \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": rpc error: code = NotFound desc = could not find container \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": container with ID starting with 6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8 not found: ID does not exist" Apr 24 21:19:36.371889 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.371885 2573 scope.go:117] "RemoveContainer" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.372088 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372072 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} err="failed to get container status \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": rpc error: code = NotFound desc = could not find container \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": container with ID starting with 26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d not found: ID does not exist" Apr 24 21:19:36.372135 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372088 2573 scope.go:117] "RemoveContainer" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.372277 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372261 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32"} err="failed to get container status 
\"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": rpc error: code = NotFound desc = could not find container \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": container with ID starting with dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32 not found: ID does not exist" Apr 24 21:19:36.372277 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372277 2573 scope.go:117] "RemoveContainer" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.372453 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372435 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} err="failed to get container status \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": rpc error: code = NotFound desc = could not find container \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": container with ID starting with a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4 not found: ID does not exist" Apr 24 21:19:36.372520 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372455 2573 scope.go:117] "RemoveContainer" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.372642 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372627 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} err="failed to get container status \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": rpc error: code = NotFound desc = could not find container \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": container with ID starting with d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545 not found: ID does not exist" Apr 24 21:19:36.372689 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:19:36.372643 2573 scope.go:117] "RemoveContainer" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.372798 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372782 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} err="failed to get container status \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": rpc error: code = NotFound desc = could not find container \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": container with ID starting with 6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13 not found: ID does not exist" Apr 24 21:19:36.372842 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372799 2573 scope.go:117] "RemoveContainer" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.372982 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372965 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} err="failed to get container status \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": rpc error: code = NotFound desc = could not find container \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": container with ID starting with eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2 not found: ID does not exist" Apr 24 21:19:36.373037 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.372982 2573 scope.go:117] "RemoveContainer" containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.373157 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373138 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} err="failed to get container status \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": rpc error: code = NotFound desc = could not find container \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": container with ID starting with 6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8 not found: ID does not exist" Apr 24 21:19:36.373195 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373166 2573 scope.go:117] "RemoveContainer" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.373401 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373380 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} err="failed to get container status \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": rpc error: code = NotFound desc = could not find container \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": container with ID starting with 26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d not found: ID does not exist" Apr 24 21:19:36.373475 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373401 2573 scope.go:117] "RemoveContainer" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.373601 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373584 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32"} err="failed to get container status \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": rpc error: code = NotFound desc = could not find container \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": container with ID starting with 
dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32 not found: ID does not exist" Apr 24 21:19:36.373639 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373603 2573 scope.go:117] "RemoveContainer" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.373795 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373778 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} err="failed to get container status \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": rpc error: code = NotFound desc = could not find container \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": container with ID starting with a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4 not found: ID does not exist" Apr 24 21:19:36.373832 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.373795 2573 scope.go:117] "RemoveContainer" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.374105 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.374075 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} err="failed to get container status \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": rpc error: code = NotFound desc = could not find container \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": container with ID starting with d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545 not found: ID does not exist" Apr 24 21:19:36.374105 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.374106 2573 scope.go:117] "RemoveContainer" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.374377 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.374354 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} err="failed to get container status \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": rpc error: code = NotFound desc = could not find container \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": container with ID starting with 6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13 not found: ID does not exist" Apr 24 21:19:36.374446 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.374379 2573 scope.go:117] "RemoveContainer" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.374725 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.374699 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} err="failed to get container status \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": rpc error: code = NotFound desc = could not find container \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": container with ID starting with eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2 not found: ID does not exist" Apr 24 21:19:36.374832 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.374727 2573 scope.go:117] "RemoveContainer" containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.375015 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.374991 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} err="failed to get container status \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": rpc error: code = NotFound desc = could not find container 
\"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": container with ID starting with 6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8 not found: ID does not exist" Apr 24 21:19:36.375093 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.375018 2573 scope.go:117] "RemoveContainer" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.375297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.375275 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} err="failed to get container status \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": rpc error: code = NotFound desc = could not find container \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": container with ID starting with 26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d not found: ID does not exist" Apr 24 21:19:36.375356 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.375298 2573 scope.go:117] "RemoveContainer" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.375541 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.375514 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32"} err="failed to get container status \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": rpc error: code = NotFound desc = could not find container \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": container with ID starting with dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32 not found: ID does not exist" Apr 24 21:19:36.375596 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.375541 2573 scope.go:117] "RemoveContainer" 
containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.375797 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.375772 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} err="failed to get container status \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": rpc error: code = NotFound desc = could not find container \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": container with ID starting with a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4 not found: ID does not exist" Apr 24 21:19:36.375934 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.375797 2573 scope.go:117] "RemoveContainer" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.376121 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376100 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} err="failed to get container status \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": rpc error: code = NotFound desc = could not find container \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": container with ID starting with d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545 not found: ID does not exist" Apr 24 21:19:36.376121 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376120 2573 scope.go:117] "RemoveContainer" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.376224 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376172 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:19:36.376368 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376344 2573 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} err="failed to get container status \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": rpc error: code = NotFound desc = could not find container \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": container with ID starting with 6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13 not found: ID does not exist" Apr 24 21:19:36.376415 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376371 2573 scope.go:117] "RemoveContainer" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.376560 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376542 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="prometheus" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376562 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="prometheus" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376574 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="config-reloader" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376580 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="config-reloader" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376587 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376592 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" 
containerName="kube-rbac-proxy" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376601 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="init-config-reloader" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376607 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="init-config-reloader" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376615 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-thanos" Apr 24 21:19:36.376625 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376622 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-thanos" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376634 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="thanos-sidecar" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376639 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="thanos-sidecar" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376574 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} err="failed to get container status \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": rpc error: code = NotFound desc = could not find container \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": container with ID starting with eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2 not found: ID 
does not exist" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376648 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-web" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376653 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-web" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376673 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" containerName="registry" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376678 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" containerName="registry" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376653 2573 scope.go:117] "RemoveContainer" containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376735 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5c73ec4-3e98-4b0a-a1ae-236998f28fb1" containerName="registry" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376745 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="config-reloader" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376754 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-web" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376762 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy" Apr 24 
21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376772 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="thanos-sidecar" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376783 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="prometheus" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376794 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" containerName="kube-rbac-proxy-thanos" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376940 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} err="failed to get container status \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": rpc error: code = NotFound desc = could not find container \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": container with ID starting with 6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8 not found: ID does not exist" Apr 24 21:19:36.376954 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.376955 2573 scope.go:117] "RemoveContainer" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.377486 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377147 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} err="failed to get container status \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": rpc error: code = NotFound desc = could not find container \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": container with ID starting with 
26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d not found: ID does not exist" Apr 24 21:19:36.377486 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377163 2573 scope.go:117] "RemoveContainer" containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.377486 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377364 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32"} err="failed to get container status \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": rpc error: code = NotFound desc = could not find container \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": container with ID starting with dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32 not found: ID does not exist" Apr 24 21:19:36.377486 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377414 2573 scope.go:117] "RemoveContainer" containerID="a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4" Apr 24 21:19:36.377702 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377679 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4"} err="failed to get container status \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": rpc error: code = NotFound desc = could not find container \"a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4\": container with ID starting with a9989b4ced884de0e01591188e34e54ad859ab914d17a36658f41c2bb264c8d4 not found: ID does not exist" Apr 24 21:19:36.377753 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377704 2573 scope.go:117] "RemoveContainer" containerID="d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545" Apr 24 21:19:36.377968 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377949 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545"} err="failed to get container status \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": rpc error: code = NotFound desc = could not find container \"d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545\": container with ID starting with d1a0ccbb66ad0ce99239eb8e4d7da820afcbd2f6611d1a92103b6010fdc7c545 not found: ID does not exist" Apr 24 21:19:36.378011 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.377970 2573 scope.go:117] "RemoveContainer" containerID="6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13" Apr 24 21:19:36.378178 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378163 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13"} err="failed to get container status \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": rpc error: code = NotFound desc = could not find container \"6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13\": container with ID starting with 6575c8ed7b614d77a5011483d23b6306867a82e2a855466d0a1683a4368d7c13 not found: ID does not exist" Apr 24 21:19:36.378231 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378178 2573 scope.go:117] "RemoveContainer" containerID="eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2" Apr 24 21:19:36.378375 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378358 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2"} err="failed to get container status \"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": rpc error: code = NotFound desc = could not find container 
\"eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2\": container with ID starting with eef430f2e6add31af1c6ce3c662e538899119acc33508daa8778b9c1630b1ba2 not found: ID does not exist" Apr 24 21:19:36.378422 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378375 2573 scope.go:117] "RemoveContainer" containerID="6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8" Apr 24 21:19:36.378586 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378567 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8"} err="failed to get container status \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": rpc error: code = NotFound desc = could not find container \"6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8\": container with ID starting with 6b3c9afd9020293b98b03d0295df6427b44d85c9a87336a5afb21c71c6a991e8 not found: ID does not exist" Apr 24 21:19:36.378635 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378587 2573 scope.go:117] "RemoveContainer" containerID="26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d" Apr 24 21:19:36.378836 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378816 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d"} err="failed to get container status \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": rpc error: code = NotFound desc = could not find container \"26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d\": container with ID starting with 26c1b6860aa2016aa819a17e464109d8b96193c2d4306d45e5910e7f5c46f61d not found: ID does not exist" Apr 24 21:19:36.378903 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.378836 2573 scope.go:117] "RemoveContainer" 
containerID="dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32" Apr 24 21:19:36.379092 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.379072 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32"} err="failed to get container status \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": rpc error: code = NotFound desc = could not find container \"dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32\": container with ID starting with dfbfe1325c6539258808a9347d5b803eae2a2e9b2e3b8138dab4e36f3d52ae32 not found: ID does not exist" Apr 24 21:19:36.382285 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.382270 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:36.386095 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.386065 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:19:36.386095 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.386074 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:19:36.394480 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.394460 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:19:36.395227 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.395206 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:19:36.395227 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.395225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-m26qw\"" Apr 24 21:19:36.395362 
ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.395235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:19:36.395362 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.395264 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8tcl0k3ke0b36\"" Apr 24 21:19:36.395362 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.395206 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:19:36.408334 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.408318 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:19:36.408507 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.408495 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:19:36.409517 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.409504 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:19:36.415505 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.415491 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:19:36.423629 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzms\" (UniqueName: \"kubernetes.io/projected/6bc5e087-28d3-4e11-9207-606735fbb327-kube-api-access-lvzms\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:19:36.423691 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:19:36.423637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423691 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423691 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423794 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423794 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423769 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423863 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423989 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423888 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-config\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423989 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423989 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423989 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.423989 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.423982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.424143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.424000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6bc5e087-28d3-4e11-9207-606735fbb327-config-out\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.424143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.424032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-web-config\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.424143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.424062 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6bc5e087-28d3-4e11-9207-606735fbb327-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.424143 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.424083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.442349 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.442324 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:19:36.448300 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.448282 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 21:19:36.452019 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.451760 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 21:19:36.525482 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-config\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.525482 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.525482 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6bc5e087-28d3-4e11-9207-606735fbb327-config-out\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-web-config\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6bc5e087-28d3-4e11-9207-606735fbb327-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzms\" (UniqueName: \"kubernetes.io/projected/6bc5e087-28d3-4e11-9207-606735fbb327-kube-api-access-lvzms\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.525961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.526005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.526050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.526083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.526281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.526834 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.526712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.529366 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.528361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.529366 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.528913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.529366 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.528958 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.529366 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.528990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6bc5e087-28d3-4e11-9207-606735fbb327-config-out\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.529366 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.529270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.529366 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.529328 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-config\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.529704 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.529453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.530237 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.530140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.530321 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.530272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-web-config\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.530572 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.530548 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.530897 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.530860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6bc5e087-28d3-4e11-9207-606735fbb327-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.531273 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.531250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.531413 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.531397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6bc5e087-28d3-4e11-9207-606735fbb327-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.531566 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.531547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.531631 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.531614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6bc5e087-28d3-4e11-9207-606735fbb327-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.535138 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.535119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzms\" (UniqueName: \"kubernetes.io/projected/6bc5e087-28d3-4e11-9207-606735fbb327-kube-api-access-lvzms\") pod \"prometheus-k8s-0\" (UID: \"6bc5e087-28d3-4e11-9207-606735fbb327\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.571092 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.571068 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8" path="/var/lib/kubelet/pods/c611eee9-1cbe-4e0f-abe6-0eccd82b9fd8/volumes"
Apr 24 21:19:36.690711 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.690681 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:19:36.816421 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:36.816396 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:19:36.818376 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:19:36.818353 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc5e087_28d3_4e11_9207_606735fbb327.slice/crio-0338df21f02cae41df062fe3f86596715b5e3c73510d7aaefcaadaba09aed745 WatchSource:0}: Error finding container 0338df21f02cae41df062fe3f86596715b5e3c73510d7aaefcaadaba09aed745: Status 404 returned error can't find the container with id 0338df21f02cae41df062fe3f86596715b5e3c73510d7aaefcaadaba09aed745
Apr 24 21:19:37.325516 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:37.325485 2573 generic.go:358] "Generic (PLEG): container finished" podID="6bc5e087-28d3-4e11-9207-606735fbb327" containerID="5df3963a910a720a0fff2292d20ecbf846308e497d765617e1fa88b0c538bcfa" exitCode=0
Apr 24 21:19:37.325949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:37.325570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerDied","Data":"5df3963a910a720a0fff2292d20ecbf846308e497d765617e1fa88b0c538bcfa"}
Apr 24 21:19:37.325949 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:37.325616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerStarted","Data":"0338df21f02cae41df062fe3f86596715b5e3c73510d7aaefcaadaba09aed745"}
Apr 24 21:19:38.332511 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:38.332477 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerStarted","Data":"f87a1234e9b7b5472c44fec86616ad63722de45d4a5679c49e5015b0db2f18c2"}
Apr 24 21:19:38.332511 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:38.332514 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerStarted","Data":"b64f13c220b8c0c2cc016577fdc1f3c807ed8d9b5b2f0fff4ffa8c75dae26c92"}
Apr 24 21:19:38.332906 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:38.332526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerStarted","Data":"5334421dc4704d43089f854a5a631740a720fae77ccadd97a3427cc3bb355302"}
Apr 24 21:19:38.332906 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:38.332534 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerStarted","Data":"6e269c9fa23aa27987568c6a1914bc669fad2e2e5aef5ae9689ea758c2fa9578"}
Apr 24 21:19:38.332906 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:38.332542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerStarted","Data":"7bb84e30c6e798afc23ab02eb0e7c9f2ab73731dd5a977426ee24e0417ea38e5"}
Apr 24 21:19:38.332906 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:38.332550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6bc5e087-28d3-4e11-9207-606735fbb327","Type":"ContainerStarted","Data":"d7dd64b729260b3b6300262b515c80834f8a410fa90849e22c618408ae0c281e"}
Apr 24 21:19:38.361757 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:38.361698 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.361683481 podStartE2EDuration="2.361683481s" podCreationTimestamp="2026-04-24 21:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:19:38.360156788 +0000 UTC m=+238.355907546" watchObservedRunningTime="2026-04-24 21:19:38.361683481 +0000 UTC m=+238.357434239"
Apr 24 21:19:41.691599 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:19:41.691558 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:20:36.691831 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:36.691796 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:20:36.706704 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:36.706678 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:20:37.517286 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:37.517250 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:20:40.450040 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:40.450010 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log"
Apr 24 21:20:40.450978 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:40.450956 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log"
Apr 24 21:20:40.454234 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:40.454214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log"
Apr 24 21:20:40.454910 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:40.454893 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log"
Apr 24 21:20:40.460928 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:20:40.460910 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:24:42.030680 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.030599 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-krml7"]
Apr 24 21:24:42.034108 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.034087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.036784 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.036743 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 21:24:42.037651 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.037630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-8rg2q\""
Apr 24 21:24:42.037771 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.037664 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:24:42.037771 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.037689 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:24:42.042785 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.042766 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-krml7"]
Apr 24 21:24:42.121486 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.121457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/381681d5-f4d2-4d6a-95df-11552410ec25-cert\") pod \"odh-model-controller-696fc77849-krml7\" (UID: \"381681d5-f4d2-4d6a-95df-11552410ec25\") " pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.121654 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.121539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgxwq\" (UniqueName: \"kubernetes.io/projected/381681d5-f4d2-4d6a-95df-11552410ec25-kube-api-access-qgxwq\") pod \"odh-model-controller-696fc77849-krml7\" (UID: \"381681d5-f4d2-4d6a-95df-11552410ec25\") " pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.222600 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.222570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgxwq\" (UniqueName: \"kubernetes.io/projected/381681d5-f4d2-4d6a-95df-11552410ec25-kube-api-access-qgxwq\") pod \"odh-model-controller-696fc77849-krml7\" (UID: \"381681d5-f4d2-4d6a-95df-11552410ec25\") " pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.222763 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.222656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/381681d5-f4d2-4d6a-95df-11552410ec25-cert\") pod \"odh-model-controller-696fc77849-krml7\" (UID: \"381681d5-f4d2-4d6a-95df-11552410ec25\") " pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.222812 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:24:42.222786 2573 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 24 21:24:42.222850 ip-10-0-134-147 kubenswrapper[2573]: E0424 21:24:42.222846 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/381681d5-f4d2-4d6a-95df-11552410ec25-cert podName:381681d5-f4d2-4d6a-95df-11552410ec25 nodeName:}" failed. No retries permitted until 2026-04-24 21:24:42.722827239 +0000 UTC m=+542.718577981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/381681d5-f4d2-4d6a-95df-11552410ec25-cert") pod "odh-model-controller-696fc77849-krml7" (UID: "381681d5-f4d2-4d6a-95df-11552410ec25") : secret "odh-model-controller-webhook-cert" not found
Apr 24 21:24:42.232894 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.232857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgxwq\" (UniqueName: \"kubernetes.io/projected/381681d5-f4d2-4d6a-95df-11552410ec25-kube-api-access-qgxwq\") pod \"odh-model-controller-696fc77849-krml7\" (UID: \"381681d5-f4d2-4d6a-95df-11552410ec25\") " pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.726381 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.726344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/381681d5-f4d2-4d6a-95df-11552410ec25-cert\") pod \"odh-model-controller-696fc77849-krml7\" (UID: \"381681d5-f4d2-4d6a-95df-11552410ec25\") " pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.728640 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.728617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/381681d5-f4d2-4d6a-95df-11552410ec25-cert\") pod \"odh-model-controller-696fc77849-krml7\" (UID: \"381681d5-f4d2-4d6a-95df-11552410ec25\") " pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:42.946166 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:42.946135 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:43.079333 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:43.079303 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-krml7"]
Apr 24 21:24:43.083050 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:24:43.083013 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod381681d5_f4d2_4d6a_95df_11552410ec25.slice/crio-b0f346d38ed81f9742fe3b5862fa9ed5104fc0a39cf736a28d70fe4d1459064f WatchSource:0}: Error finding container b0f346d38ed81f9742fe3b5862fa9ed5104fc0a39cf736a28d70fe4d1459064f: Status 404 returned error can't find the container with id b0f346d38ed81f9742fe3b5862fa9ed5104fc0a39cf736a28d70fe4d1459064f
Apr 24 21:24:43.084197 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:43.084178 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:24:43.193179 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:43.193144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-krml7" event={"ID":"381681d5-f4d2-4d6a-95df-11552410ec25","Type":"ContainerStarted","Data":"b0f346d38ed81f9742fe3b5862fa9ed5104fc0a39cf736a28d70fe4d1459064f"}
Apr 24 21:24:46.203914 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:46.203865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-krml7" event={"ID":"381681d5-f4d2-4d6a-95df-11552410ec25","Type":"ContainerStarted","Data":"ff50f5c3665636a38106cb5fe545149154a360857978c3aab6184894158a7d10"}
Apr 24 21:24:46.204284 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:46.204026 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:46.232963 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:46.232921 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-krml7" podStartSLOduration=1.829473368 podStartE2EDuration="4.232907794s" podCreationTimestamp="2026-04-24 21:24:42 +0000 UTC" firstStartedPulling="2026-04-24 21:24:43.084296751 +0000 UTC m=+543.080047487" lastFinishedPulling="2026-04-24 21:24:45.487731163 +0000 UTC m=+545.483481913" observedRunningTime="2026-04-24 21:24:46.232568964 +0000 UTC m=+546.228319722" watchObservedRunningTime="2026-04-24 21:24:46.232907794 +0000 UTC m=+546.228658548"
Apr 24 21:24:57.209671 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:57.209639 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-krml7"
Apr 24 21:24:57.995658 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:57.995624 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-pnl7z"]
Apr 24 21:24:57.998840 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:57.998824 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pnl7z"
Apr 24 21:24:58.001154 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.001131 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 21:24:58.001154 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.001132 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-d7l67\""
Apr 24 21:24:58.005965 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.005941 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pnl7z"]
Apr 24 21:24:58.057410 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.057384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrtn\" (UniqueName: \"kubernetes.io/projected/e5d4192a-b937-4b5c-aabc-9e283e8b8c17-kube-api-access-mrrtn\") pod \"s3-init-pnl7z\" (UID: \"e5d4192a-b937-4b5c-aabc-9e283e8b8c17\") " pod="kserve/s3-init-pnl7z"
Apr 24 21:24:58.158162 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.158128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrtn\" (UniqueName: \"kubernetes.io/projected/e5d4192a-b937-4b5c-aabc-9e283e8b8c17-kube-api-access-mrrtn\") pod \"s3-init-pnl7z\" (UID: \"e5d4192a-b937-4b5c-aabc-9e283e8b8c17\") " pod="kserve/s3-init-pnl7z"
Apr 24 21:24:58.166571 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.166547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrtn\" (UniqueName: \"kubernetes.io/projected/e5d4192a-b937-4b5c-aabc-9e283e8b8c17-kube-api-access-mrrtn\") pod \"s3-init-pnl7z\" (UID: \"e5d4192a-b937-4b5c-aabc-9e283e8b8c17\") " pod="kserve/s3-init-pnl7z"
Apr 24 21:24:58.316645 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.316562 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-pnl7z" Apr 24 21:24:58.430914 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:58.430890 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pnl7z"] Apr 24 21:24:58.433532 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:24:58.433504 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d4192a_b937_4b5c_aabc_9e283e8b8c17.slice/crio-03a0e117d3704a8fd797c5068b9b4a3c5942c1385b275c0d33b66a3fd6aa4889 WatchSource:0}: Error finding container 03a0e117d3704a8fd797c5068b9b4a3c5942c1385b275c0d33b66a3fd6aa4889: Status 404 returned error can't find the container with id 03a0e117d3704a8fd797c5068b9b4a3c5942c1385b275c0d33b66a3fd6aa4889 Apr 24 21:24:59.245157 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:24:59.245116 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pnl7z" event={"ID":"e5d4192a-b937-4b5c-aabc-9e283e8b8c17","Type":"ContainerStarted","Data":"03a0e117d3704a8fd797c5068b9b4a3c5942c1385b275c0d33b66a3fd6aa4889"} Apr 24 21:25:03.266549 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:03.266514 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pnl7z" event={"ID":"e5d4192a-b937-4b5c-aabc-9e283e8b8c17","Type":"ContainerStarted","Data":"91ffb4c04581b3e2d2ed58cca1be68faa908d0c8f605b89004d5ab63afbf33c4"} Apr 24 21:25:03.288543 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:03.288486 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-pnl7z" podStartSLOduration=1.898014581 podStartE2EDuration="6.288467991s" podCreationTimestamp="2026-04-24 21:24:57 +0000 UTC" firstStartedPulling="2026-04-24 21:24:58.435797421 +0000 UTC m=+558.431548161" lastFinishedPulling="2026-04-24 21:25:02.82625083 +0000 UTC m=+562.822001571" observedRunningTime="2026-04-24 21:25:03.286025628 +0000 UTC m=+563.281776395" watchObservedRunningTime="2026-04-24 
21:25:03.288467991 +0000 UTC m=+563.284218750" Apr 24 21:25:06.277149 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:06.277115 2573 generic.go:358] "Generic (PLEG): container finished" podID="e5d4192a-b937-4b5c-aabc-9e283e8b8c17" containerID="91ffb4c04581b3e2d2ed58cca1be68faa908d0c8f605b89004d5ab63afbf33c4" exitCode=0 Apr 24 21:25:06.277616 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:06.277159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pnl7z" event={"ID":"e5d4192a-b937-4b5c-aabc-9e283e8b8c17","Type":"ContainerDied","Data":"91ffb4c04581b3e2d2ed58cca1be68faa908d0c8f605b89004d5ab63afbf33c4"} Apr 24 21:25:07.400145 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:07.400116 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pnl7z" Apr 24 21:25:07.442754 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:07.442725 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrrtn\" (UniqueName: \"kubernetes.io/projected/e5d4192a-b937-4b5c-aabc-9e283e8b8c17-kube-api-access-mrrtn\") pod \"e5d4192a-b937-4b5c-aabc-9e283e8b8c17\" (UID: \"e5d4192a-b937-4b5c-aabc-9e283e8b8c17\") " Apr 24 21:25:07.444786 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:07.444762 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d4192a-b937-4b5c-aabc-9e283e8b8c17-kube-api-access-mrrtn" (OuterVolumeSpecName: "kube-api-access-mrrtn") pod "e5d4192a-b937-4b5c-aabc-9e283e8b8c17" (UID: "e5d4192a-b937-4b5c-aabc-9e283e8b8c17"). InnerVolumeSpecName "kube-api-access-mrrtn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:25:07.543701 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:07.543627 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrrtn\" (UniqueName: \"kubernetes.io/projected/e5d4192a-b937-4b5c-aabc-9e283e8b8c17-kube-api-access-mrrtn\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:25:08.284966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:08.284931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pnl7z" event={"ID":"e5d4192a-b937-4b5c-aabc-9e283e8b8c17","Type":"ContainerDied","Data":"03a0e117d3704a8fd797c5068b9b4a3c5942c1385b275c0d33b66a3fd6aa4889"} Apr 24 21:25:08.284966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:08.284964 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03a0e117d3704a8fd797c5068b9b4a3c5942c1385b275c0d33b66a3fd6aa4889" Apr 24 21:25:08.284966 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:08.284965 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-pnl7z" Apr 24 21:25:15.879730 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.879694 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-56h98"] Apr 24 21:25:15.880117 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.880022 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5d4192a-b937-4b5c-aabc-9e283e8b8c17" containerName="s3-init" Apr 24 21:25:15.880117 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.880033 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d4192a-b937-4b5c-aabc-9e283e8b8c17" containerName="s3-init" Apr 24 21:25:15.880117 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.880097 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5d4192a-b937-4b5c-aabc-9e283e8b8c17" containerName="s3-init" Apr 24 21:25:15.883271 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.883254 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-56h98" Apr 24 21:25:15.885661 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.885636 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:25:15.885736 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.885685 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-d7l67\"" Apr 24 21:25:15.890297 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:15.890266 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-56h98"] Apr 24 21:25:16.017802 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:16.017766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhdl\" (UniqueName: \"kubernetes.io/projected/cf639ec8-5ec1-408d-b8c3-07575522abeb-kube-api-access-sdhdl\") pod \"s3-tls-init-custom-56h98\" (UID: \"cf639ec8-5ec1-408d-b8c3-07575522abeb\") " pod="kserve/s3-tls-init-custom-56h98" Apr 24 21:25:16.118692 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:16.118661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhdl\" (UniqueName: \"kubernetes.io/projected/cf639ec8-5ec1-408d-b8c3-07575522abeb-kube-api-access-sdhdl\") pod \"s3-tls-init-custom-56h98\" (UID: \"cf639ec8-5ec1-408d-b8c3-07575522abeb\") " pod="kserve/s3-tls-init-custom-56h98" Apr 24 21:25:16.127377 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:16.127354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhdl\" (UniqueName: \"kubernetes.io/projected/cf639ec8-5ec1-408d-b8c3-07575522abeb-kube-api-access-sdhdl\") pod \"s3-tls-init-custom-56h98\" (UID: \"cf639ec8-5ec1-408d-b8c3-07575522abeb\") " pod="kserve/s3-tls-init-custom-56h98" Apr 24 21:25:16.211629 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:16.211605 2573 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-56h98" Apr 24 21:25:16.327475 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:16.327356 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-56h98"] Apr 24 21:25:16.330044 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:25:16.330013 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf639ec8_5ec1_408d_b8c3_07575522abeb.slice/crio-b8aa608f015c33642063c44b3cbddc74ca1bdd34d338ac41947122b85dfd0fb3 WatchSource:0}: Error finding container b8aa608f015c33642063c44b3cbddc74ca1bdd34d338ac41947122b85dfd0fb3: Status 404 returned error can't find the container with id b8aa608f015c33642063c44b3cbddc74ca1bdd34d338ac41947122b85dfd0fb3 Apr 24 21:25:17.316405 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:17.316319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-56h98" event={"ID":"cf639ec8-5ec1-408d-b8c3-07575522abeb","Type":"ContainerStarted","Data":"72a8b3b0b1ac8dc80d1960090de9d3ce61fd2e78746c6c6b6919b7b176571d22"} Apr 24 21:25:17.316405 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:17.316356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-56h98" event={"ID":"cf639ec8-5ec1-408d-b8c3-07575522abeb","Type":"ContainerStarted","Data":"b8aa608f015c33642063c44b3cbddc74ca1bdd34d338ac41947122b85dfd0fb3"} Apr 24 21:25:17.334943 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:17.334885 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-56h98" podStartSLOduration=2.334851632 podStartE2EDuration="2.334851632s" podCreationTimestamp="2026-04-24 21:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:25:17.333506796 +0000 UTC m=+577.329257561" 
watchObservedRunningTime="2026-04-24 21:25:17.334851632 +0000 UTC m=+577.330602405" Apr 24 21:25:21.329624 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:21.329593 2573 generic.go:358] "Generic (PLEG): container finished" podID="cf639ec8-5ec1-408d-b8c3-07575522abeb" containerID="72a8b3b0b1ac8dc80d1960090de9d3ce61fd2e78746c6c6b6919b7b176571d22" exitCode=0 Apr 24 21:25:21.330022 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:21.329670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-56h98" event={"ID":"cf639ec8-5ec1-408d-b8c3-07575522abeb","Type":"ContainerDied","Data":"72a8b3b0b1ac8dc80d1960090de9d3ce61fd2e78746c6c6b6919b7b176571d22"} Apr 24 21:25:22.461805 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:22.461784 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-56h98" Apr 24 21:25:22.575480 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:22.575449 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdhdl\" (UniqueName: \"kubernetes.io/projected/cf639ec8-5ec1-408d-b8c3-07575522abeb-kube-api-access-sdhdl\") pod \"cf639ec8-5ec1-408d-b8c3-07575522abeb\" (UID: \"cf639ec8-5ec1-408d-b8c3-07575522abeb\") " Apr 24 21:25:22.577285 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:22.577260 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf639ec8-5ec1-408d-b8c3-07575522abeb-kube-api-access-sdhdl" (OuterVolumeSpecName: "kube-api-access-sdhdl") pod "cf639ec8-5ec1-408d-b8c3-07575522abeb" (UID: "cf639ec8-5ec1-408d-b8c3-07575522abeb"). InnerVolumeSpecName "kube-api-access-sdhdl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:25:22.676819 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:22.676749 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdhdl\" (UniqueName: \"kubernetes.io/projected/cf639ec8-5ec1-408d-b8c3-07575522abeb-kube-api-access-sdhdl\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:25:23.336766 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:23.336734 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-56h98" Apr 24 21:25:23.336766 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:23.336744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-56h98" event={"ID":"cf639ec8-5ec1-408d-b8c3-07575522abeb","Type":"ContainerDied","Data":"b8aa608f015c33642063c44b3cbddc74ca1bdd34d338ac41947122b85dfd0fb3"} Apr 24 21:25:23.336766 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:23.336774 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8aa608f015c33642063c44b3cbddc74ca1bdd34d338ac41947122b85dfd0fb3" Apr 24 21:25:26.173650 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.173617 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-c9mvc"] Apr 24 21:25:26.174030 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.174012 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf639ec8-5ec1-408d-b8c3-07575522abeb" containerName="s3-tls-init-custom" Apr 24 21:25:26.174030 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.174026 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf639ec8-5ec1-408d-b8c3-07575522abeb" containerName="s3-tls-init-custom" Apr 24 21:25:26.174117 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.174086 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf639ec8-5ec1-408d-b8c3-07575522abeb" containerName="s3-tls-init-custom" 
Apr 24 21:25:26.176044 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.176030 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-c9mvc" Apr 24 21:25:26.178391 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.178371 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 21:25:26.178515 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.178406 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-d7l67\"" Apr 24 21:25:26.184792 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.184771 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-c9mvc"] Apr 24 21:25:26.303591 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.303559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqbsc\" (UniqueName: \"kubernetes.io/projected/aa51bbf9-cdff-47df-87ca-9b7fa788c0a9-kube-api-access-hqbsc\") pod \"s3-tls-init-serving-c9mvc\" (UID: \"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9\") " pod="kserve/s3-tls-init-serving-c9mvc" Apr 24 21:25:26.404205 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.404170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqbsc\" (UniqueName: \"kubernetes.io/projected/aa51bbf9-cdff-47df-87ca-9b7fa788c0a9-kube-api-access-hqbsc\") pod \"s3-tls-init-serving-c9mvc\" (UID: \"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9\") " pod="kserve/s3-tls-init-serving-c9mvc" Apr 24 21:25:26.413236 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.413208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqbsc\" (UniqueName: \"kubernetes.io/projected/aa51bbf9-cdff-47df-87ca-9b7fa788c0a9-kube-api-access-hqbsc\") pod \"s3-tls-init-serving-c9mvc\" (UID: \"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9\") " 
pod="kserve/s3-tls-init-serving-c9mvc" Apr 24 21:25:26.497563 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.497536 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-c9mvc" Apr 24 21:25:26.617621 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:26.617590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-c9mvc"] Apr 24 21:25:26.621122 ip-10-0-134-147 kubenswrapper[2573]: W0424 21:25:26.621094 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa51bbf9_cdff_47df_87ca_9b7fa788c0a9.slice/crio-a92a7407eb05ccfb976bac01261bc9878ecfe1314fa6c385288b305b21508cc2 WatchSource:0}: Error finding container a92a7407eb05ccfb976bac01261bc9878ecfe1314fa6c385288b305b21508cc2: Status 404 returned error can't find the container with id a92a7407eb05ccfb976bac01261bc9878ecfe1314fa6c385288b305b21508cc2 Apr 24 21:25:27.349210 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:27.349175 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9mvc" event={"ID":"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9","Type":"ContainerStarted","Data":"9dceeef9ad04e49b9c638161e19d8df728f29f8140ca91488685ccaa159027dd"} Apr 24 21:25:27.349210 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:27.349212 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9mvc" event={"ID":"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9","Type":"ContainerStarted","Data":"a92a7407eb05ccfb976bac01261bc9878ecfe1314fa6c385288b305b21508cc2"} Apr 24 21:25:27.365713 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:27.365658 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-c9mvc" podStartSLOduration=1.3656440810000001 podStartE2EDuration="1.365644081s" podCreationTimestamp="2026-04-24 21:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:25:27.363248015 +0000 UTC m=+587.358998772" watchObservedRunningTime="2026-04-24 21:25:27.365644081 +0000 UTC m=+587.361394838" Apr 24 21:25:31.362254 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:31.362220 2573 generic.go:358] "Generic (PLEG): container finished" podID="aa51bbf9-cdff-47df-87ca-9b7fa788c0a9" containerID="9dceeef9ad04e49b9c638161e19d8df728f29f8140ca91488685ccaa159027dd" exitCode=0 Apr 24 21:25:31.362650 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:31.362295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9mvc" event={"ID":"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9","Type":"ContainerDied","Data":"9dceeef9ad04e49b9c638161e19d8df728f29f8140ca91488685ccaa159027dd"} Apr 24 21:25:32.507789 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:32.507759 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-c9mvc" Apr 24 21:25:32.658800 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:32.658730 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqbsc\" (UniqueName: \"kubernetes.io/projected/aa51bbf9-cdff-47df-87ca-9b7fa788c0a9-kube-api-access-hqbsc\") pod \"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9\" (UID: \"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9\") " Apr 24 21:25:32.660702 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:32.660670 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa51bbf9-cdff-47df-87ca-9b7fa788c0a9-kube-api-access-hqbsc" (OuterVolumeSpecName: "kube-api-access-hqbsc") pod "aa51bbf9-cdff-47df-87ca-9b7fa788c0a9" (UID: "aa51bbf9-cdff-47df-87ca-9b7fa788c0a9"). InnerVolumeSpecName "kube-api-access-hqbsc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:25:32.759611 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:32.759586 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqbsc\" (UniqueName: \"kubernetes.io/projected/aa51bbf9-cdff-47df-87ca-9b7fa788c0a9-kube-api-access-hqbsc\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 21:25:33.368650 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:33.368609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-c9mvc" event={"ID":"aa51bbf9-cdff-47df-87ca-9b7fa788c0a9","Type":"ContainerDied","Data":"a92a7407eb05ccfb976bac01261bc9878ecfe1314fa6c385288b305b21508cc2"} Apr 24 21:25:33.368650 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:33.368635 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-c9mvc" Apr 24 21:25:33.368926 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:33.368641 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92a7407eb05ccfb976bac01261bc9878ecfe1314fa6c385288b305b21508cc2" Apr 24 21:25:40.474389 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:40.474357 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:25:40.476719 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:40.476692 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:25:40.478111 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:25:40.478093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:25:40.480225 ip-10-0-134-147 kubenswrapper[2573]: I0424 
21:25:40.480206 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:30:40.497766 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:30:40.497740 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:30:40.500746 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:30:40.500726 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:30:40.501347 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:30:40.501329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:30:40.504393 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:30:40.504373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:35:40.521122 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:35:40.521095 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:35:40.524489 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:35:40.524471 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:35:40.524615 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:35:40.524537 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:35:40.527843 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:35:40.527823 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:40:40.546520 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:40:40.546490 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:40:40.548371 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:40:40.548346 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:40:40.550335 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:40:40.550317 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:40:40.551698 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:40:40.551680 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:45:40.578155 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:45:40.578129 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:45:40.579400 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:45:40.579377 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:45:40.581712 ip-10-0-134-147 
kubenswrapper[2573]: I0424 21:45:40.581685 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:45:40.582712 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:45:40.582694 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:50:40.600603 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:50:40.600574 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:50:40.602813 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:50:40.602783 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:50:40.604095 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:50:40.604078 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:50:40.606065 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:50:40.606047 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:55:40.624969 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:55:40.624939 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:55:40.627790 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:55:40.627770 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 21:55:40.628099 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:55:40.628081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 21:55:40.631138 ip-10-0-134-147 kubenswrapper[2573]: I0424 21:55:40.631119 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:00:40.649296 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:00:40.649223 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:00:40.652848 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:00:40.652829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:00:40.653265 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:00:40.653246 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:00:40.656616 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:00:40.656599 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:05:40.672603 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:05:40.672575 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:05:40.676085 ip-10-0-134-147 
kubenswrapper[2573]: I0424 22:05:40.676063 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:05:40.676859 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:05:40.676829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:05:40.680177 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:05:40.680158 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:10:40.695313 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:10:40.695269 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:10:40.698706 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:10:40.698686 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:10:40.699582 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:10:40.699565 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:10:40.702780 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:10:40.702763 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:15:40.717148 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:15:40.717115 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:15:40.720450 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:15:40.720428 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:15:40.722725 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:15:40.722709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:15:40.726311 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:15:40.726295 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log" Apr 24 22:19:58.820217 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.820184 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7k9g8/must-gather-xhl9b"] Apr 24 22:19:58.820622 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.820508 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa51bbf9-cdff-47df-87ca-9b7fa788c0a9" containerName="s3-tls-init-serving" Apr 24 22:19:58.820622 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.820521 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa51bbf9-cdff-47df-87ca-9b7fa788c0a9" containerName="s3-tls-init-serving" Apr 24 22:19:58.820622 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.820612 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa51bbf9-cdff-47df-87ca-9b7fa788c0a9" containerName="s3-tls-init-serving" Apr 24 22:19:58.823674 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.823655 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:19:58.826006 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.825987 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7k9g8\"/\"kube-root-ca.crt\"" Apr 24 22:19:58.826115 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.825988 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7k9g8\"/\"openshift-service-ca.crt\"" Apr 24 22:19:58.831826 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.831802 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k9g8/must-gather-xhl9b"] Apr 24 22:19:58.984130 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.984099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/379e3d2b-e974-47e0-9a15-23ef71369561-must-gather-output\") pod \"must-gather-xhl9b\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:19:58.984295 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:58.984142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjqk\" (UniqueName: \"kubernetes.io/projected/379e3d2b-e974-47e0-9a15-23ef71369561-kube-api-access-ltjqk\") pod \"must-gather-xhl9b\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:19:59.084613 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.084533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjqk\" (UniqueName: \"kubernetes.io/projected/379e3d2b-e974-47e0-9a15-23ef71369561-kube-api-access-ltjqk\") pod \"must-gather-xhl9b\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 
24 22:19:59.084752 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.084684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/379e3d2b-e974-47e0-9a15-23ef71369561-must-gather-output\") pod \"must-gather-xhl9b\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:19:59.085005 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.084988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/379e3d2b-e974-47e0-9a15-23ef71369561-must-gather-output\") pod \"must-gather-xhl9b\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:19:59.093033 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.093006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjqk\" (UniqueName: \"kubernetes.io/projected/379e3d2b-e974-47e0-9a15-23ef71369561-kube-api-access-ltjqk\") pod \"must-gather-xhl9b\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:19:59.151045 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.151021 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:19:59.268828 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.268803 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k9g8/must-gather-xhl9b"] Apr 24 22:19:59.271099 ip-10-0-134-147 kubenswrapper[2573]: W0424 22:19:59.271076 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod379e3d2b_e974_47e0_9a15_23ef71369561.slice/crio-1c5840e6dd8fbfac17d0ccbf907d8a9f660131092a4cd4e4fe64c226ae675ce7 WatchSource:0}: Error finding container 1c5840e6dd8fbfac17d0ccbf907d8a9f660131092a4cd4e4fe64c226ae675ce7: Status 404 returned error can't find the container with id 1c5840e6dd8fbfac17d0ccbf907d8a9f660131092a4cd4e4fe64c226ae675ce7 Apr 24 22:19:59.272645 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.272629 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:19:59.855600 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:19:59.855565 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" event={"ID":"379e3d2b-e974-47e0-9a15-23ef71369561","Type":"ContainerStarted","Data":"1c5840e6dd8fbfac17d0ccbf907d8a9f660131092a4cd4e4fe64c226ae675ce7"} Apr 24 22:20:03.871373 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:03.871336 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" event={"ID":"379e3d2b-e974-47e0-9a15-23ef71369561","Type":"ContainerStarted","Data":"c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72"} Apr 24 22:20:03.871373 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:03.871377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" 
event={"ID":"379e3d2b-e974-47e0-9a15-23ef71369561","Type":"ContainerStarted","Data":"13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105"} Apr 24 22:20:24.939175 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:24.939091 2573 generic.go:358] "Generic (PLEG): container finished" podID="379e3d2b-e974-47e0-9a15-23ef71369561" containerID="13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105" exitCode=0 Apr 24 22:20:24.939175 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:24.939163 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" event={"ID":"379e3d2b-e974-47e0-9a15-23ef71369561","Type":"ContainerDied","Data":"13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105"} Apr 24 22:20:24.939599 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:24.939462 2573 scope.go:117] "RemoveContainer" containerID="13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105" Apr 24 22:20:25.667588 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:25.667557 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7k9g8_must-gather-xhl9b_379e3d2b-e974-47e0-9a15-23ef71369561/gather/0.log" Apr 24 22:20:29.113637 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:29.113606 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mt97w_d143aeb1-2388-4e2b-94e5-feca18fa8e79/global-pull-secret-syncer/0.log" Apr 24 22:20:29.311513 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:29.311483 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vcw5r_790d66ed-34eb-4ff4-b315-99c7cda83b63/konnectivity-agent/0.log" Apr 24 22:20:29.440416 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:29.440348 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-147.ec2.internal_c7195f4bce997b5527f5c21d5b6e5e49/haproxy/0.log" Apr 24 22:20:31.129041 
ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.129002 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7k9g8/must-gather-xhl9b"] Apr 24 22:20:31.129494 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.129277 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" containerName="copy" containerID="cri-o://c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72" gracePeriod=2 Apr 24 22:20:31.131453 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.131419 2573 status_manager.go:895] "Failed to get status for pod" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" err="pods \"must-gather-xhl9b\" is forbidden: User \"system:node:ip-10-0-134-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7k9g8\": no relationship found between node 'ip-10-0-134-147.ec2.internal' and this object" Apr 24 22:20:31.133412 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.133143 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7k9g8/must-gather-xhl9b"] Apr 24 22:20:31.354842 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.354820 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7k9g8_must-gather-xhl9b_379e3d2b-e974-47e0-9a15-23ef71369561/copy/0.log" Apr 24 22:20:31.355221 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.355207 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:20:31.357117 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.357093 2573 status_manager.go:895] "Failed to get status for pod" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" err="pods \"must-gather-xhl9b\" is forbidden: User \"system:node:ip-10-0-134-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7k9g8\": no relationship found between node 'ip-10-0-134-147.ec2.internal' and this object" Apr 24 22:20:31.452314 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.452293 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltjqk\" (UniqueName: \"kubernetes.io/projected/379e3d2b-e974-47e0-9a15-23ef71369561-kube-api-access-ltjqk\") pod \"379e3d2b-e974-47e0-9a15-23ef71369561\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " Apr 24 22:20:31.452403 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.452372 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/379e3d2b-e974-47e0-9a15-23ef71369561-must-gather-output\") pod \"379e3d2b-e974-47e0-9a15-23ef71369561\" (UID: \"379e3d2b-e974-47e0-9a15-23ef71369561\") " Apr 24 22:20:31.453425 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.453400 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379e3d2b-e974-47e0-9a15-23ef71369561-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "379e3d2b-e974-47e0-9a15-23ef71369561" (UID: "379e3d2b-e974-47e0-9a15-23ef71369561"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:20:31.454441 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.454417 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379e3d2b-e974-47e0-9a15-23ef71369561-kube-api-access-ltjqk" (OuterVolumeSpecName: "kube-api-access-ltjqk") pod "379e3d2b-e974-47e0-9a15-23ef71369561" (UID: "379e3d2b-e974-47e0-9a15-23ef71369561"). InnerVolumeSpecName "kube-api-access-ltjqk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:20:31.553955 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.553931 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/379e3d2b-e974-47e0-9a15-23ef71369561-must-gather-output\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 22:20:31.553955 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.553952 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltjqk\" (UniqueName: \"kubernetes.io/projected/379e3d2b-e974-47e0-9a15-23ef71369561-kube-api-access-ltjqk\") on node \"ip-10-0-134-147.ec2.internal\" DevicePath \"\"" Apr 24 22:20:31.959807 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.959780 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7k9g8_must-gather-xhl9b_379e3d2b-e974-47e0-9a15-23ef71369561/copy/0.log" Apr 24 22:20:31.960148 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.960127 2573 generic.go:358] "Generic (PLEG): container finished" podID="379e3d2b-e974-47e0-9a15-23ef71369561" containerID="c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72" exitCode=143 Apr 24 22:20:31.960219 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.960173 2573 scope.go:117] "RemoveContainer" containerID="c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72" Apr 24 22:20:31.960219 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.960176 2573 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" Apr 24 22:20:31.962349 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.962315 2573 status_manager.go:895] "Failed to get status for pod" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" err="pods \"must-gather-xhl9b\" is forbidden: User \"system:node:ip-10-0-134-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7k9g8\": no relationship found between node 'ip-10-0-134-147.ec2.internal' and this object" Apr 24 22:20:31.968026 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.968012 2573 scope.go:117] "RemoveContainer" containerID="13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105" Apr 24 22:20:31.970080 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.970057 2573 status_manager.go:895] "Failed to get status for pod" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" pod="openshift-must-gather-7k9g8/must-gather-xhl9b" err="pods \"must-gather-xhl9b\" is forbidden: User \"system:node:ip-10-0-134-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7k9g8\": no relationship found between node 'ip-10-0-134-147.ec2.internal' and this object" Apr 24 22:20:31.979312 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.979295 2573 scope.go:117] "RemoveContainer" containerID="c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72" Apr 24 22:20:31.979544 ip-10-0-134-147 kubenswrapper[2573]: E0424 22:20:31.979523 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72\": container with ID starting with c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72 not found: ID does not exist" 
containerID="c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72" Apr 24 22:20:31.979593 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.979553 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72"} err="failed to get container status \"c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72\": rpc error: code = NotFound desc = could not find container \"c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72\": container with ID starting with c101d26c4d94dafe8995078663820eaa9a8e48953ebcb8c2828ed9f447a66c72 not found: ID does not exist" Apr 24 22:20:31.979593 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.979571 2573 scope.go:117] "RemoveContainer" containerID="13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105" Apr 24 22:20:31.979793 ip-10-0-134-147 kubenswrapper[2573]: E0424 22:20:31.979778 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105\": container with ID starting with 13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105 not found: ID does not exist" containerID="13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105" Apr 24 22:20:31.979848 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:31.979798 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105"} err="failed to get container status \"13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105\": rpc error: code = NotFound desc = could not find container \"13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105\": container with ID starting with 13adee59ba311031aac49e00a5a98e3bb5f362fbf95a694434fe9ee9ae44e105 not found: ID does not exist" Apr 24 
22:20:32.139410 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.139380 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-nl9ms_772a4257-de5c-42f7-8b8c-0ee2404f99a6/cluster-monitoring-operator/0.log" Apr 24 22:20:32.285498 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.285421 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5978444448-xwdl6_e4dc9d95-b126-48df-97ba-118157d5b0a4/metrics-server/0.log" Apr 24 22:20:32.363214 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.363185 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7khp8_bfcb8364-b2c4-4264-99a6-c796a5c6678a/node-exporter/0.log" Apr 24 22:20:32.390548 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.390524 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7khp8_bfcb8364-b2c4-4264-99a6-c796a5c6678a/kube-rbac-proxy/0.log" Apr 24 22:20:32.425027 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.425007 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7khp8_bfcb8364-b2c4-4264-99a6-c796a5c6678a/init-textfile/0.log" Apr 24 22:20:32.571367 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.571291 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" path="/var/lib/kubelet/pods/379e3d2b-e974-47e0-9a15-23ef71369561/volumes" Apr 24 22:20:32.730368 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.730345 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6bc5e087-28d3-4e11-9207-606735fbb327/prometheus/0.log" Apr 24 22:20:32.787157 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.787132 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6bc5e087-28d3-4e11-9207-606735fbb327/config-reloader/0.log" Apr 24 22:20:32.819177 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.819156 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6bc5e087-28d3-4e11-9207-606735fbb327/thanos-sidecar/0.log" Apr 24 22:20:32.844669 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.844616 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6bc5e087-28d3-4e11-9207-606735fbb327/kube-rbac-proxy-web/0.log" Apr 24 22:20:32.870688 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.870670 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6bc5e087-28d3-4e11-9207-606735fbb327/kube-rbac-proxy/0.log" Apr 24 22:20:32.894449 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.894430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6bc5e087-28d3-4e11-9207-606735fbb327/kube-rbac-proxy-thanos/0.log" Apr 24 22:20:32.923956 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.923938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6bc5e087-28d3-4e11-9207-606735fbb327/init-config-reloader/0.log" Apr 24 22:20:32.957734 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.957705 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hpwxd_7d8232dc-a440-4a5c-8138-8119b4f19fd7/prometheus-operator/0.log" Apr 24 22:20:32.987076 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:32.987055 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hpwxd_7d8232dc-a440-4a5c-8138-8119b4f19fd7/kube-rbac-proxy/0.log" Apr 24 22:20:33.015599 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:33.015579 2573 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-mdndr_5f38d17b-da8e-46bd-ba98-9a498446b4d2/prometheus-operator-admission-webhook/0.log" Apr 24 22:20:33.138529 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:33.138453 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/thanos-query/0.log" Apr 24 22:20:33.175535 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:33.175515 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy-web/0.log" Apr 24 22:20:33.202047 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:33.202024 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy/0.log" Apr 24 22:20:33.229589 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:33.229567 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/prom-label-proxy/0.log" Apr 24 22:20:33.254335 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:33.254317 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy-rules/0.log" Apr 24 22:20:33.278824 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:33.278808 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b89f6dc86-bwsr8_ef5047f4-c231-4399-9dd1-f54603124de8/kube-rbac-proxy-metrics/0.log" Apr 24 22:20:35.050936 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:35.050908 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log" Apr 24 22:20:35.055164 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:35.055147 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/3.log" Apr 24 22:20:36.376619 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.376590 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"] Apr 24 22:20:36.377001 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.376936 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" containerName="gather" Apr 24 22:20:36.377001 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.376949 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" containerName="gather" Apr 24 22:20:36.377001 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.376957 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" containerName="copy" Apr 24 22:20:36.377001 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.376962 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" containerName="copy" Apr 24 22:20:36.377134 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.377019 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" containerName="gather" Apr 24 22:20:36.377134 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.377027 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="379e3d2b-e974-47e0-9a15-23ef71369561" containerName="copy" Apr 24 22:20:36.382140 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.382122 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f" Apr 24 22:20:36.384741 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.384716 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2zhfh\"/\"kube-root-ca.crt\"" Apr 24 22:20:36.384741 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.384729 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2zhfh\"/\"openshift-service-ca.crt\"" Apr 24 22:20:36.384948 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.384725 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2zhfh\"/\"default-dockercfg-fxrng\"" Apr 24 22:20:36.387401 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.387380 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"] Apr 24 22:20:36.492019 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.491989 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-lib-modules\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f" Apr 24 22:20:36.492209 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.492035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2m5x\" (UniqueName: \"kubernetes.io/projected/74c23b2e-f081-4b00-9fbf-3f8b50f90901-kube-api-access-f2m5x\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f" Apr 24 22:20:36.492209 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.492117 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-proc\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.492209 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.492163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-podres\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.492320 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.492219 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-sys\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.529651 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.529628 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4fr27_022d6343-7dfc-470e-8e3c-3380ea630933/dns/0.log"
Apr 24 22:20:36.553809 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.553791 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4fr27_022d6343-7dfc-470e-8e3c-3380ea630933/kube-rbac-proxy/0.log"
Apr 24 22:20:36.593224 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-proc\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593333 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-podres\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593333 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593265 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-sys\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593333 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-lib-modules\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593473 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-proc\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593473 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-podres\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593473 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-lib-modules\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593473 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593428 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74c23b2e-f081-4b00-9fbf-3f8b50f90901-sys\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.593473 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.593451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2m5x\" (UniqueName: \"kubernetes.io/projected/74c23b2e-f081-4b00-9fbf-3f8b50f90901-kube-api-access-f2m5x\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.600523 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.600505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2m5x\" (UniqueName: \"kubernetes.io/projected/74c23b2e-f081-4b00-9fbf-3f8b50f90901-kube-api-access-f2m5x\") pod \"perf-node-gather-daemonset-zt48f\" (UID: \"74c23b2e-f081-4b00-9fbf-3f8b50f90901\") " pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.692355 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.692274 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.706689 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.706664 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zscc7_a1ae49ae-a1e3-464e-a9db-3d0bad2349ab/dns-node-resolver/0.log"
Apr 24 22:20:36.807650 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.807624 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"]
Apr 24 22:20:36.809480 ip-10-0-134-147 kubenswrapper[2573]: W0424 22:20:36.809452 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod74c23b2e_f081_4b00_9fbf_3f8b50f90901.slice/crio-fbdf6f113694e6f25afcba44b09b268763a61d032b54f01ee2ab16ff1df86a35 WatchSource:0}: Error finding container fbdf6f113694e6f25afcba44b09b268763a61d032b54f01ee2ab16ff1df86a35: Status 404 returned error can't find the container with id fbdf6f113694e6f25afcba44b09b268763a61d032b54f01ee2ab16ff1df86a35
Apr 24 22:20:36.976073 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.976047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f" event={"ID":"74c23b2e-f081-4b00-9fbf-3f8b50f90901","Type":"ContainerStarted","Data":"a7b3e8ce3f5d9335b63f8721c8c5898f6da0507f51c665773eb694598c9d8b46"}
Apr 24 22:20:36.976073 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.976080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f" event={"ID":"74c23b2e-f081-4b00-9fbf-3f8b50f90901","Type":"ContainerStarted","Data":"fbdf6f113694e6f25afcba44b09b268763a61d032b54f01ee2ab16ff1df86a35"}
Apr 24 22:20:36.976213 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.976173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:36.991150 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:36.990598 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f" podStartSLOduration=0.990581988 podStartE2EDuration="990.581988ms" podCreationTimestamp="2026-04-24 22:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:20:36.989541291 +0000 UTC m=+3896.985292048" watchObservedRunningTime="2026-04-24 22:20:36.990581988 +0000 UTC m=+3896.986332747"
Apr 24 22:20:37.184196 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:37.184173 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wv544_f71747fd-1913-4d70-b833-4f352b05ba15/node-ca/0.log"
Apr 24 22:20:38.232552 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:38.232522 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gp5s2_7c51aa96-bca7-47fa-bba2-badf0e22ee4d/serve-healthcheck-canary/0.log"
Apr 24 22:20:38.563268 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:38.563182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xsrdx_fb24c7be-a2bf-47fd-a8da-4bcaf272012a/insights-operator/0.log"
Apr 24 22:20:38.564661 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:38.564641 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xsrdx_fb24c7be-a2bf-47fd-a8da-4bcaf272012a/insights-operator/1.log"
Apr 24 22:20:38.584304 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:38.584282 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78x5v_bd8ed4fc-6e48-474a-8cc2-9f257be8decd/kube-rbac-proxy/0.log"
Apr 24 22:20:38.610011 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:38.609992 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78x5v_bd8ed4fc-6e48-474a-8cc2-9f257be8decd/exporter/0.log"
Apr 24 22:20:38.633022 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:38.633003 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-78x5v_bd8ed4fc-6e48-474a-8cc2-9f257be8decd/extractor/0.log"
Apr 24 22:20:40.740479 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:40.740364 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log"
Apr 24 22:20:40.763816 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:40.743962 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log"
Apr 24 22:20:40.763816 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:40.746379 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dln5m_c1a58610-8ce3-4b65-8ceb-500127ff5a26/console-operator/2.log"
Apr 24 22:20:40.763816 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:40.750077 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log"
Apr 24 22:20:40.944167 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:40.944136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-krml7_381681d5-f4d2-4d6a-95df-11552410ec25/manager/0.log"
Apr 24 22:20:40.965044 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:40.965024 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-pnl7z_e5d4192a-b937-4b5c-aabc-9e283e8b8c17/s3-init/0.log"
Apr 24 22:20:40.989254 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:40.989232 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-56h98_cf639ec8-5ec1-408d-b8c3-07575522abeb/s3-tls-init-custom/0.log"
Apr 24 22:20:41.011903 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:41.011829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-c9mvc_aa51bbf9-cdff-47df-87ca-9b7fa788c0a9/s3-tls-init-serving/0.log"
Apr 24 22:20:42.989073 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:42.989046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2zhfh/perf-node-gather-daemonset-zt48f"
Apr 24 22:20:44.498431 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:44.498403 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mrzsk_4dae161f-cd3c-4605-b787-8a713855dd4c/migrator/0.log"
Apr 24 22:20:44.518361 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:44.518339 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mrzsk_4dae161f-cd3c-4605-b787-8a713855dd4c/graceful-termination/0.log"
Apr 24 22:20:44.861412 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:44.861388 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gmz9s_621c1634-3d26-4e3e-8a2e-f735fa5423f9/kube-storage-version-migrator-operator/1.log"
Apr 24 22:20:44.862074 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:44.862055 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gmz9s_621c1634-3d26-4e3e-8a2e-f735fa5423f9/kube-storage-version-migrator-operator/0.log"
Apr 24 22:20:45.844773 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:45.844710 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4qhzb_bc4f30e8-cb9e-4140-96c8-b6e37a3be1d2/kube-multus/0.log"
Apr 24 22:20:46.045717 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.045687 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hz9sp_f855c2da-63b3-4393-85d5-d812d3b86100/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:20:46.066352 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.066328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hz9sp_f855c2da-63b3-4393-85d5-d812d3b86100/egress-router-binary-copy/0.log"
Apr 24 22:20:46.087322 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.087305 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hz9sp_f855c2da-63b3-4393-85d5-d812d3b86100/cni-plugins/0.log"
Apr 24 22:20:46.108199 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.108149 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hz9sp_f855c2da-63b3-4393-85d5-d812d3b86100/bond-cni-plugin/0.log"
Apr 24 22:20:46.131255 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.131235 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hz9sp_f855c2da-63b3-4393-85d5-d812d3b86100/routeoverride-cni/0.log"
Apr 24 22:20:46.153684 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.153663 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hz9sp_f855c2da-63b3-4393-85d5-d812d3b86100/whereabouts-cni-bincopy/0.log"
Apr 24 22:20:46.174837 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.174822 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hz9sp_f855c2da-63b3-4393-85d5-d812d3b86100/whereabouts-cni/0.log"
Apr 24 22:20:46.456725 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.456703 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m6d6n_223043ea-b132-4d5d-9a14-0496d53fdc53/network-metrics-daemon/0.log"
Apr 24 22:20:46.476351 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:46.476333 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-m6d6n_223043ea-b132-4d5d-9a14-0496d53fdc53/kube-rbac-proxy/0.log"
Apr 24 22:20:47.982130 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:47.982075 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-controller/0.log"
Apr 24 22:20:47.999535 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:47.999509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/0.log"
Apr 24 22:20:48.016850 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:48.016827 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovn-acl-logging/1.log"
Apr 24 22:20:48.034423 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:48.034401 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/kube-rbac-proxy-node/0.log"
Apr 24 22:20:48.054443 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:48.054425 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:20:48.072210 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:48.072194 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/northd/0.log"
Apr 24 22:20:48.091997 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:48.091978 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/nbdb/0.log"
Apr 24 22:20:48.113016 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:48.112996 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/sbdb/0.log"
Apr 24 22:20:48.209173 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:48.209146 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zvr9w_0a61c3b4-bf4f-42f7-afd7-075420c1040d/ovnkube-controller/0.log"
Apr 24 22:20:49.102716 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:49.102685 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-bg2dl_593cf1e8-79f1-4dd1-a9db-ebd333078407/check-endpoints/0.log"
Apr 24 22:20:49.169445 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:49.169420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wtj7q_ff449891-1658-40ff-a0bd-e08978c661e9/network-check-target-container/0.log"
Apr 24 22:20:50.071115 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:50.071074 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-bztfd_019f7af9-37e7-4923-a370-a980a06b7377/iptables-alerter/0.log"
Apr 24 22:20:50.782720 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:50.782693 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-t26r7_5e385e54-2661-4cd9-8bc1-ad9750b2e402/tuned/0.log"
Apr 24 22:20:52.398694 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:52.398663 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-sqcwt_1706f63c-2ef2-44a4-9a58-455c69e1901d/cluster-samples-operator/0.log"
Apr 24 22:20:52.417628 ip-10-0-134-147 kubenswrapper[2573]: I0424 22:20:52.417601 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-sqcwt_1706f63c-2ef2-44a4-9a58-455c69e1901d/cluster-samples-operator-watch/0.log"