Apr 17 14:18:29.628180 ip-10-0-143-215 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 14:18:29.628192 ip-10-0-143-215 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 14:18:29.628200 ip-10-0-143-215 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 14:18:29.628448 ip-10-0-143-215 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 14:18:39.685863 ip-10-0-143-215 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 14:18:39.685884 ip-10-0-143-215 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ef942b4eee0e4114b9a1fcaad586ed5c --
Apr 17 14:21:44.832535 ip-10-0-143-215 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:21:45.332828 ip-10-0-143-215 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:21:45.332828 ip-10-0-143-215 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:21:45.332828 ip-10-0-143-215 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:21:45.332828 ip-10-0-143-215 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:21:45.332828 ip-10-0-143-215 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:21:45.333821 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.333734 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:21:45.340438 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340404 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:21:45.340438 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340434 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:45.340438 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340438 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:21:45.340438 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340442 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:21:45.340438 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340445 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:45.340438 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340447 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:45.340438 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340450 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340453 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340456 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340459 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340462 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340464 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340467 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340469 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340472 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340474 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340477 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340480 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340482 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340485 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340487 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340490 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340493 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340496 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340499 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340502 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:21:45.340706 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340505 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340507 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340510 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340512 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340515 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340517 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340520 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340523 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340525 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340527 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340530 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340532 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340535 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340537 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340540 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340543 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340546 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340549 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340554 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340557 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:45.341224 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340561 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340563 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340566 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340568 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340571 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340574 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340577 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340580 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340582 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340585 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340588 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340590 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340593 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340595 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340598 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340600 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340603 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340605 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340608 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340611 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:45.341714 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340614 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340616 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340618 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340622 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340624 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340627 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340630 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340633 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340635 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340638 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340640 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340643 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340645 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340648 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340651 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340654 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340656 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340659 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340661 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:45.342232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.340665 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341113 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341119 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341122 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341125 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341128 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341130 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341133 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341136 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341139 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341142 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341145 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341148 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341150 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341153 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341156 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341158 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341161 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341180 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:21:45.342671 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341185 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341188 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341192 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341196 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341199 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341201 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341205 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341208 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341210 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341213 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341217 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341221 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341224 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341227 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341231 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341234 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341237 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341239 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341242 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:21:45.343107 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341245 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341247 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341250 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341254 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341257 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341260 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341263 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341266 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341269 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341271 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341274 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341277 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341280 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341282 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341285 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341287 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341290 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341292 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341295 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341297 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:21:45.343583 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341300 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341303 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341308 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341310 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341313 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341315 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341318 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341320 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341323 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341325 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341328 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341331 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341333 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341336 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341338 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341341 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341344 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341346 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341349 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341351 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:21:45.344074 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341355 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341357 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341360 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341363 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341366 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341368 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341371 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341373 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.341377 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342080 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342089 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342095 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342100 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342106 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342109 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342114 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342119 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342123 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342126 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342129 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342133 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342136 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:21:45.344689 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342139 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342142 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342144 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342147 2568 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342150 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342153 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342157 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342160 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342174 2568 flags.go:64] FLAG: --config-dir=""
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342177 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342180 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342184 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342188 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342191 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342194 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342198 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342200 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342203 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342207 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342210 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342214 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342217 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342220 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342223 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342227 2568 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:21:45.345266 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342230 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342235 2568 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342238 2568 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342242 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342245 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342248 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342252 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342255 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342258 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342261 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342264 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342266 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342269 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342273 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342276 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342278 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342281 2568 flags.go:64] FLAG: --feature-gates=""
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342286 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342289 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342292 2568
flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342295 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342298 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342301 2568 flags.go:64] FLAG: --help="false" Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342304 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-143-215.ec2.internal" Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342307 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:21:45.345885 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342310 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342313 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342316 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342320 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342323 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342326 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342328 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342332 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 14:21:45.346523 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:21:45.342335 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342338 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342341 2568 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342345 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342347 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342350 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342353 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342356 2568 flags.go:64] FLAG: --lock-file="" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342358 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342361 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342364 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342370 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342373 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342375 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342378 2568 flags.go:64] FLAG: 
--logging-format="text" Apr 17 14:21:45.346523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342381 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342384 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342387 2568 flags.go:64] FLAG: --manifest-url="" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342390 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342394 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342397 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342401 2568 flags.go:64] FLAG: --max-pods="110" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342404 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342406 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342409 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342412 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342415 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342418 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342421 2568 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342428 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342432 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342435 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342438 2568 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342441 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342447 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342450 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342454 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342457 2568 flags.go:64] FLAG: --port="10250" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342460 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:21:45.347080 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342463 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05f57867b1c76b7bc" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342466 2568 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342469 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342472 
2568 flags.go:64] FLAG: --register-node="true" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342475 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342478 2568 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342481 2568 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342484 2568 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342487 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342489 2568 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342493 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342496 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342499 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342502 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342505 2568 flags.go:64] FLAG: --runonce="false" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342508 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342510 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342513 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 
14:21:45.342516 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342519 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342522 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342526 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342529 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342531 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342534 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342540 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 14:21:45.347679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342544 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342547 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342550 2568 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342553 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342558 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342561 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342564 2568 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342568 2568 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342571 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342574 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342577 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342580 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342583 2568 flags.go:64] FLAG: --v="2" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342587 2568 flags.go:64] FLAG: --version="false" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342591 2568 flags.go:64] FLAG: --vmodule="" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342596 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.342599 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342699 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342703 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342707 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342709 2568 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342712 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342715 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:21:45.348298 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342718 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342720 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342723 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342725 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342728 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342730 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342733 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342735 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342739 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342742 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342744 2568 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342747 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342750 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342752 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342755 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342757 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342760 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342762 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342766 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342770 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:21:45.348857 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342773 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342776 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342778 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342781 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342783 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342803 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342807 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342811 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342813 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342816 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342819 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342822 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342824 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342827 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342830 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342832 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342835 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342838 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342840 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342843 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:21:45.349378 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342847 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342850 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342852 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342856 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342862 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342865 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342868 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342870 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342873 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342876 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342878 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342881 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342883 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342886 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342888 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342891 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342893 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342896 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342898 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342901 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:45.349872 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342904 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342906 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342909 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342911 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342914 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342916 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342918 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342921 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342924 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342926 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342928 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342931 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342936 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342939 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342942 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342945 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342949 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342951 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342954 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:21:45.350404 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.342956 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.343742 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.350469 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.350487 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350537 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350542 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350546 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350549 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350552 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350555 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350557 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350560 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350563 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350565 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350568 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:21:45.350856 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350571 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350573 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350576 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350578 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350581 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350584 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350587 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350589 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350591 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350594 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350596 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350599 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350601 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350604 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350607 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350609 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350612 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350614 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350616 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350619 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:21:45.351272 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350622 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350626 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350630 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350633 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350636 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350639 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350641 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350643 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350646 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350648 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350651 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350653 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350656 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350658 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 
14:21:45.350661 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350663 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350666 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350668 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350671 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:21:45.351744 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350673 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350676 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350678 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350681 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350683 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350686 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350688 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350691 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 
14:21:45.350694 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350696 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350699 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350701 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350703 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350707 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350710 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350712 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350715 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350718 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350720 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350722 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:21:45.352232 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350725 2568 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350727 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350730 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350732 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350735 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350737 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350740 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350742 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350745 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350747 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350750 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350752 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350755 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 
14:21:45.350757 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350760 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:21:45.352742 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350763 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.350769 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350863 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350867 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350870 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350872 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350876 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350879 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 
14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350882 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350884 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350889 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350892 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350895 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350897 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350916 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:21:45.353115 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350919 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350922 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350925 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350928 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350931 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350933 2568 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350936 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350938 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350941 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350943 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350946 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350948 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350950 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350953 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350956 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350959 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350961 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350964 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 
14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350966 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350969 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:21:45.353518 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350971 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350974 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350976 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350980 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350984 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350987 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350990 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350993 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350996 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.350999 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351001 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351004 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351007 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351009 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351012 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351014 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351017 2568 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351019 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351021 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:21:45.354038 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351024 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351026 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351029 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351031 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351034 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351036 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351038 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351041 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351043 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351045 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 
14:21:45.351048 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351050 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351053 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351055 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351058 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351060 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351062 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351065 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351068 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351071 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:21:45.354523 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351074 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351076 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351078 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:21:45.355002 ip-10-0-143-215 
kubenswrapper[2568]: W0417 14:21:45.351081 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351083 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351086 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351088 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351091 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351093 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351096 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351098 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351101 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351104 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:45.351108 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.351112 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 14:21:45.355002 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.351884 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 14:21:45.355392 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.354031 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 14:21:45.355392 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.355081 2568 server.go:1019] "Starting client certificate rotation" Apr 17 14:21:45.355392 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.355176 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 14:21:45.355392 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.355216 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 14:21:45.382817 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.382796 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 14:21:45.389227 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.389212 2568 dynamic_cafile_content.go:161] "Starting 
controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:21:45.408584 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.408560 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:21:45.414327 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.414310 2568 log.go:25] "Validated CRI v1 image API"
Apr 17 14:21:45.415212 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.415195 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:21:45.415743 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.415717 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:21:45.423585 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.423545 2568 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b9e9dc7f-80b4-48f3-9aea-65ff08dc434f:/dev/nvme0n1p3 d80d41a8-917f-4422-a151-07e0a5711c14:/dev/nvme0n1p4]
Apr 17 14:21:45.423659 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.423585 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:21:45.430250 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.430002 2568 manager.go:217] Machine: {Timestamp:2026-04-17 14:21:45.427785856 +0000 UTC m=+0.461991746 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3186708 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a38657994e0f7a318e0fbd2be5478 SystemUUID:ec2a3865-7994-e0f7-a318-e0fbd2be5478 BootID:ef942b4e-ee0e-4114-b9a1-fcaad586ed5c Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fa:79:67:00:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fa:79:67:00:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:da:85:ef:e7:5f:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:21:45.430797 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.430787 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:21:45.430923 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.430911 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 14:21:45.431824 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.431807 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t85wh"
Apr 17 14:21:45.432238 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.432216 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 14:21:45.432383 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.432240 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-215.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 14:21:45.432427 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.432392 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 14:21:45.432427 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.432400 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 14:21:45.432427 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.432413 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 14:21:45.433238 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.433227 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 14:21:45.434713 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.434704 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 14:21:45.434815 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.434807 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 14:21:45.437602 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.437593 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 14:21:45.437649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.437607 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 14:21:45.437649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.437618 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 14:21:45.437649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.437632 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 17 14:21:45.437649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.437640 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 14:21:45.438342 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.438327 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t85wh"
Apr 17 14:21:45.439859 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.439845 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 14:21:45.439918 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.439870 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 14:21:45.443026 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.443008 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 14:21:45.444914 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.444897 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 14:21:45.446382 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446365 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446386 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446394 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446400 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446406 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446411 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446417 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446422 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446436 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446442 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 14:21:45.446453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446455 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 14:21:45.446718 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.446463 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 14:21:45.447545 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.447535 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 14:21:45.447589 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.447545 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 14:21:45.451589 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.451574 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 14:21:45.451678 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.451622 2568 server.go:1295] "Started kubelet"
Apr 17 14:21:45.451998 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.451794 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 14:21:45.452506 ip-10-0-143-215 systemd[1]: Started Kubernetes Kubelet.
Apr 17 14:21:45.454785 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.454737 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 14:21:45.454846 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.454813 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 14:21:45.456042 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.456025 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 14:21:45.456365 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.454654 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:21:45.457777 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.457689 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:21:45.458656 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.458635 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-215.ec2.internal" not found
Apr 17 14:21:45.458770 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.458759 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 14:21:45.463401 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.463383 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 14:21:45.463499 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.463438 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 14:21:45.464126 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464108 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 14:21:45.464251 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464232 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 14:21:45.464365 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464326 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 14:21:45.464434 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464382 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 14:21:45.464434 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464392 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 14:21:45.464528 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:45.464446 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-215.ec2.internal\" not found"
Apr 17 14:21:45.464826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464807 2568 factory.go:153] Registering CRI-O factory
Apr 17 14:21:45.464826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464826 2568 factory.go:223] Registration of the crio container factory successfully
Apr 17 14:21:45.464973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464878 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 14:21:45.464973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464888 2568 factory.go:55] Registering systemd factory
Apr 17 14:21:45.464973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464896 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 17 14:21:45.464973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464916 2568 factory.go:103] Registering Raw factory
Apr 17 14:21:45.464973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.464929 2568 manager.go:1196] Started watching for new ooms in manager
Apr 17 14:21:45.465206 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:45.465110 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 14:21:45.465343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.465327 2568 manager.go:319] Starting recovery of all containers
Apr 17 14:21:45.465775 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.465746 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:21:45.467364 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:45.467340 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-215.ec2.internal\" not found" node="ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.472873 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.472852 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-215.ec2.internal" not found
Apr 17 14:21:45.475134 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.475113 2568 manager.go:324] Recovery completed
Apr 17 14:21:45.479579 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.479566 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:21:45.481465 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.481449 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-215.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:21:45.481539 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.481477 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-215.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:21:45.481539 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.481497 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-215.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:21:45.482092 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.482076 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 14:21:45.482092 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.482090 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 14:21:45.482182 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.482107 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 14:21:45.484711 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.484699 2568 policy_none.go:49] "None policy: Start"
Apr 17 14:21:45.484779 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.484716 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 14:21:45.484779 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.484745 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 14:21:45.529060 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.529038 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-215.ec2.internal" not found
Apr 17 14:21:45.538756 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.538736 2568 manager.go:341] "Starting Device Plugin manager"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:45.539047 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.539070 2568 server.go:85] "Starting device plugin registration server"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.539535 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.539565 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.539808 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.539911 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.539922 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:45.540440 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 14:21:45.554214 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:45.540474 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-215.ec2.internal\" not found"
Apr 17 14:21:45.577653 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.577617 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 14:21:45.578783 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.578766 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 14:21:45.578897 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.578790 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 14:21:45.578897 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.578809 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 14:21:45.578897 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.578816 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 14:21:45.578897 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:45.578851 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 14:21:45.583365 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.583310 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:21:45.640180 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.640097 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:21:45.641259 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.641243 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-215.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:21:45.641333 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.641275 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-215.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:21:45.641333 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.641286 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-215.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:21:45.641333 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.641311 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.649449 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.649431 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.679707 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.679676 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal"]
Apr 17 14:21:45.682302 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.682285 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.682381 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.682295 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.708816 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.708795 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.713314 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.713299 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.722289 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.722275 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:21:45.724424 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.724410 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:21:45.765754 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.765722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cca1ef78a831aed884654af106b7c2ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal\" (UID: \"cca1ef78a831aed884654af106b7c2ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.765881 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.765755 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cca1ef78a831aed884654af106b7c2ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal\" (UID: \"cca1ef78a831aed884654af106b7c2ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.765881 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.765781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3514b8218264fee7cad79c255e536dea-config\") pod \"kube-apiserver-proxy-ip-10-0-143-215.ec2.internal\" (UID: \"3514b8218264fee7cad79c255e536dea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.866418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.866331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cca1ef78a831aed884654af106b7c2ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal\" (UID: \"cca1ef78a831aed884654af106b7c2ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.866418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.866368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cca1ef78a831aed884654af106b7c2ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal\" (UID: \"cca1ef78a831aed884654af106b7c2ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.866418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.866386 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3514b8218264fee7cad79c255e536dea-config\") pod \"kube-apiserver-proxy-ip-10-0-143-215.ec2.internal\" (UID: \"3514b8218264fee7cad79c255e536dea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.866616 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.866438 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cca1ef78a831aed884654af106b7c2ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal\" (UID: \"cca1ef78a831aed884654af106b7c2ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.866616 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.866445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cca1ef78a831aed884654af106b7c2ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal\" (UID: \"cca1ef78a831aed884654af106b7c2ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:45.866616 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:45.866498 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3514b8218264fee7cad79c255e536dea-config\") pod \"kube-apiserver-proxy-ip-10-0-143-215.ec2.internal\" (UID: \"3514b8218264fee7cad79c255e536dea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:46.025603 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.025563 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:46.026624 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.026603 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal"
Apr 17 14:21:46.355607 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.355528 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 14:21:46.356429 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.355710 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:21:46.356429 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.355727 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:21:46.356429 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.355739 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:21:46.438728 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.438694 2568 apiserver.go:52] "Watching apiserver"
Apr 17 14:21:46.440881 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.440840 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:16:45 +0000 UTC" deadline="2027-10-05 03:10:41.300016596 +0000 UTC"
Apr 17 14:21:46.440960 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.440882 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12852h48m54.859139158s"
Apr 17 14:21:46.446442 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.446425 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 14:21:46.446796 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.446777 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-r9nsg","openshift-multus/multus-additional-cni-plugins-6zwtc","openshift-multus/network-metrics-daemon-tg9jd","openshift-network-diagnostics/network-check-target-6twhf","kube-system/konnectivity-agent-r7mvj","openshift-cluster-node-tuning-operator/tuned-sqzjh","openshift-dns/node-resolver-6f2x6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal","openshift-multus/multus-xgf47","openshift-network-operator/iptables-alerter-zw8fl","openshift-ovn-kubernetes/ovnkube-node-5fm94","kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"]
Apr 17 14:21:46.448461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.448440 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-r9nsg"
Apr 17 14:21:46.449786 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.449765 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.450820 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.450797 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 14:21:46.450981 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.450964 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nkhvw\""
Apr 17 14:21:46.451751 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.451332 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 14:21:46.451751 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.451560 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 14:21:46.452830 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.452060 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 14:21:46.452830 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.452344 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 14:21:46.452830 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.452374 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 14:21:46.452830 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.452552 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 14:21:46.452830 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.452640 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 14:21:46.453099 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.452841 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6hqpg\""
Apr 17 14:21:46.453473 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.453447 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:21:46.453588 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.453494 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:21:46.453699 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.453674 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848"
Apr 17 14:21:46.453770 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.453670 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b"
Apr 17 14:21:46.455056 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.455040 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r7mvj"
Apr 17 14:21:46.456078 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.456062 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.457253 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.457237 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6f2x6"
Apr 17 14:21:46.457385 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.457368 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 14:21:46.457573 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.457559 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fnm6z\""
Apr 17 14:21:46.457629 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.457582 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 14:21:46.458303 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.458284 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5nmjv\""
Apr 17 14:21:46.458384 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.458348 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:21:46.458499 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.458482 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.458591 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.458576 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 14:21:46.459645 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.459621 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-shpd6\""
Apr 17 14:21:46.459727 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.459698 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zw8fl"
Apr 17 14:21:46.459727 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.459709 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 14:21:46.459835 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.459776 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 14:21:46.460823 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.460807 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q5qqg\""
Apr 17 14:21:46.461831 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.461815 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:21:46.461954 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.461934 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 14:21:46.462018 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.461968 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 14:21:46.462018 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.462013 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7vwj\"" Apr 17 14:21:46.462255 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.462241 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 14:21:46.462725 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.462713 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.463697 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.463679 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 14:21:46.463874 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.463860 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.464945 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.464920 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b5swr\"" Apr 17 14:21:46.465098 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.464959 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 14:21:46.465389 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.465375 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 14:21:46.466195 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466153 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 14:21:46.466282 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466182 2568 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:21:46.466282 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466277 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 14:21:46.466383 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466219 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:21:46.466439 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466400 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 14:21:46.466544 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466517 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 14:21:46.466642 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466624 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 14:21:46.466699 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466670 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-v2v5k\"" Apr 17 14:21:46.466750 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.466699 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 14:21:46.470283 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/641f1866-16ba-4a95-9644-d449d621c322-iptables-alerter-script\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.470357 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470290 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-var-lib-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.470357 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-etc-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.470357 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470320 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-multus-certs\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.470357 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-kubelet\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.470493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470388 
2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-cni-netd\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.470493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470410 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovnkube-script-lib\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.470493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-os-release\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.470493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-etc-selinux\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.470493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470458 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5d72b15-9ee0-40a2-b530-7847abb993f0-serviceca\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " 
pod="openshift-image-registry/node-ca-r9nsg" Apr 17 14:21:46.470493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-tmp\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.470493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470485 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-log-socket\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5d72b15-9ee0-40a2-b530-7847abb993f0-host\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470533 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470547 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-run\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470575 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-lib-modules\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470603 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-kubelet\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470619 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-cni-bin\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470633 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkm9p\" (UniqueName: \"kubernetes.io/projected/d5d72b15-9ee0-40a2-b530-7847abb993f0-kube-api-access-pkm9p\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470655 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-system-cni-dir\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f633d5b-7896-43f3-b506-dc236c755507-hosts-file\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470691 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-socket-dir-parent\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.470721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470705 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-cni-multus\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " 
pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470734 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-conf-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470756 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-env-overrides\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-socket-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470812 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-registration-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysctl-conf\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470893 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-cni-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-ovn\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.470961 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 
14:21:46.470987 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f633d5b-7896-43f3-b506-dc236c755507-tmp-dir\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-hostroot\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471046 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471070 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-var-lib-kubelet\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " 
pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471092 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv754\" (UniqueName: \"kubernetes.io/projected/ba2c74ab-e348-46bf-a8a9-3b804800268d-kube-api-access-zv754\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.471149 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471108 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-slash\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471136 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-node-log\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471156 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-cni-bin\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471190 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnqs\" (UniqueName: 
\"kubernetes.io/projected/41c68694-ceb3-44f8-a9e8-e0655e8aa848-kube-api-access-qwnqs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471209 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovn-node-metrics-cert\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471243 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-sys\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxm4q\" (UniqueName: \"kubernetes.io/projected/7f633d5b-7896-43f3-b506-dc236c755507-kube-api-access-wxm4q\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 
14:21:46.471285 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-systemd-units\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471300 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-sys-fs\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471314 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8kj8\" (UniqueName: \"kubernetes.io/projected/65b9b252-7788-4f34-9046-a58499e7e849-kube-api-access-g8kj8\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471363 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysconfig\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-tuned\") pod \"tuned-sqzjh\" (UID: 
\"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.471662 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-system-cni-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471807 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-device-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471826 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-os-release\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471840 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rw44\" (UniqueName: \"kubernetes.io/projected/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-kube-api-access-8rw44\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471873 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-cnibin\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471909 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-daemon-config\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471953 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/641f1866-16ba-4a95-9644-d449d621c322-host-slash\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.471971 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq4lf\" (UniqueName: \"kubernetes.io/projected/641f1866-16ba-4a95-9644-d449d621c322-kube-api-access-wq4lf\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.472046 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:21:46.471992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysctl-d\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472007 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-k8s-cni-cncf-io\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.472046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-etc-kubernetes\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-run-netns\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472082 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-systemd\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovnkube-config\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472110 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p2jn\" (UniqueName: \"kubernetes.io/projected/438a43a6-e3d8-4f9f-ac52-b92baf10df16-kube-api-access-4p2jn\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/48038e4e-5e89-4f13-aeb7-05e1197d4475-agent-certs\") pod \"konnectivity-agent-r7mvj\" (UID: \"48038e4e-5e89-4f13-aeb7-05e1197d4475\") " pod="kube-system/konnectivity-agent-r7mvj" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-kubernetes\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472199 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-modprobe-d\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472214 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-systemd\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472240 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-host\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba2c74ab-e348-46bf-a8a9-3b804800268d-cni-binary-copy\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472274 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-netns\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472289 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6cz\" (UniqueName: \"kubernetes.io/projected/709e5989-ba48-455a-b8a9-25c4eafebaa4-kube-api-access-4d6cz\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472328 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-cnibin\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.472461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.472360 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/48038e4e-5e89-4f13-aeb7-05e1197d4475-konnectivity-ca\") pod \"konnectivity-agent-r7mvj\" (UID: \"48038e4e-5e89-4f13-aeb7-05e1197d4475\") " 
pod="kube-system/konnectivity-agent-r7mvj" Apr 17 14:21:46.476200 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.476183 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:21:46.500826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.500796 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7w7fn" Apr 17 14:21:46.508099 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.508073 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7w7fn" Apr 17 14:21:46.573151 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rw44\" (UniqueName: \"kubernetes.io/projected/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-kube-api-access-8rw44\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.573151 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-cnibin\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-daemon-config\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573212 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-cnibin\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573258 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/641f1866-16ba-4a95-9644-d449d621c322-host-slash\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573294 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/641f1866-16ba-4a95-9644-d449d621c322-host-slash\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wq4lf\" (UniqueName: \"kubernetes.io/projected/641f1866-16ba-4a95-9644-d449d621c322-kube-api-access-wq4lf\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: 
I0417 14:21:46.573326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysctl-d\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573343 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-k8s-cni-cncf-io\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-etc-kubernetes\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-run-netns\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-systemd\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 
14:21:46.573428 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovnkube-config\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573440 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-etc-kubernetes\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573453 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-run-netns\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p2jn\" (UniqueName: \"kubernetes.io/projected/438a43a6-e3d8-4f9f-ac52-b92baf10df16-kube-api-access-4p2jn\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573503 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-k8s-cni-cncf-io\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573792 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:21:46.573515 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-systemd\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573519 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573513 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysctl-d\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573541 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/48038e4e-5e89-4f13-aeb7-05e1197d4475-agent-certs\") pod \"konnectivity-agent-r7mvj\" (UID: \"48038e4e-5e89-4f13-aeb7-05e1197d4475\") " pod="kube-system/konnectivity-agent-r7mvj" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-kubernetes\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.573792 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-modprobe-d\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-systemd\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-host\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba2c74ab-e348-46bf-a8a9-3b804800268d-cni-binary-copy\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-netns\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.573792 ip-10-0-143-215 kubenswrapper[2568]: I0417 
14:21:46.573711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6cz\" (UniqueName: \"kubernetes.io/projected/709e5989-ba48-455a-b8a9-25c4eafebaa4-kube-api-access-4d6cz\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573759 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-cnibin\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-modprobe-d\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/48038e4e-5e89-4f13-aeb7-05e1197d4475-konnectivity-ca\") pod \"konnectivity-agent-r7mvj\" (UID: \"48038e4e-5e89-4f13-aeb7-05e1197d4475\") " pod="kube-system/konnectivity-agent-r7mvj" Apr 17 14:21:46.574652 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-kubernetes\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/641f1866-16ba-4a95-9644-d449d621c322-iptables-alerter-script\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573843 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-daemon-config\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573860 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-var-lib-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573862 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-netns\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 
14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573887 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-etc-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573915 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-multus-certs\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573949 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-systemd\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573948 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-etc-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573973 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-kubelet\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.573985 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-cnibin\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-kubelet\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.574652 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-host\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574102 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-cni-netd\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.574107 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-run-multus-certs\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.574213 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:47.074159602 +0000 UTC m=+2.108365494 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574219 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-cni-netd\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovnkube-config\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-var-lib-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574281 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovnkube-script-lib\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574384 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-os-release\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-etc-selinux\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5d72b15-9ee0-40a2-b530-7847abb993f0-serviceca\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-os-release\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574479 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/48038e4e-5e89-4f13-aeb7-05e1197d4475-konnectivity-ca\") pod \"konnectivity-agent-r7mvj\" (UID: \"48038e4e-5e89-4f13-aeb7-05e1197d4475\") " pod="kube-system/konnectivity-agent-r7mvj"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-tmp\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574521 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-log-socket\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.575606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-etc-selinux\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-log-socket\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574592 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba2c74ab-e348-46bf-a8a9-3b804800268d-cni-binary-copy\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574603 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5d72b15-9ee0-40a2-b530-7847abb993f0-host\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574666 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-run\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574690 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-lib-modules\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/641f1866-16ba-4a95-9644-d449d621c322-iptables-alerter-script\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-run\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574775 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-kubelet\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-kubelet\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-cni-bin\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkm9p\" (UniqueName: \"kubernetes.io/projected/d5d72b15-9ee0-40a2-b530-7847abb993f0-kube-api-access-pkm9p\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574889 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-lib-modules\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574897 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovnkube-script-lib\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574930 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-system-cni-dir\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.576338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574958 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f633d5b-7896-43f3-b506-dc236c755507-hosts-file\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.574984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-socket-dir-parent\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-cni-bin\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575024 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5d72b15-9ee0-40a2-b530-7847abb993f0-serviceca\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575053 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-cni-multus\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575077 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-conf-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-env-overrides\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-socket-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-registration-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysctl-conf\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-cni-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575309 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-ovn\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575307 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f633d5b-7896-43f3-b506-dc236c755507-hosts-file\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575386 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-registration-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575392 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-system-cni-dir\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.577215 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575433 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-conf-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f633d5b-7896-43f3-b506-dc236c755507-tmp-dir\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575466 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-hostroot\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/709e5989-ba48-455a-b8a9-25c4eafebaa4-env-overrides\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575517 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-ovn\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-var-lib-kubelet\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575572 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-cni-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575593 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv754\" (UniqueName: \"kubernetes.io/projected/ba2c74ab-e348-46bf-a8a9-3b804800268d-kube-api-access-zv754\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5d72b15-9ee0-40a2-b530-7847abb993f0-host\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-slash\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575630 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-socket-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-multus-socket-dir-parent\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-node-log\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f633d5b-7896-43f3-b506-dc236c755507-tmp-dir\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-run-openvswitch\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-cni-bin\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575671 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-cni-multus\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnqs\" (UniqueName: \"kubernetes.io/projected/41c68694-ceb3-44f8-a9e8-e0655e8aa848-kube-api-access-qwnqs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575757 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-var-lib-kubelet\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575801 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysctl-conf\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-host-var-lib-cni-bin\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-node-log\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovn-node-metrics-cert\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.575953 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-slash\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576003 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/65b9b252-7788-4f34-9046-a58499e7e849-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576239 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576264 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-hostroot\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-sys\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxm4q\" (UniqueName: \"kubernetes.io/projected/7f633d5b-7896-43f3-b506-dc236c755507-kube-api-access-wxm4q\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6"
Apr 17 14:21:46.578890 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576400 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-sys\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-systemd-units\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-sys-fs\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576492 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8kj8\" (UniqueName: \"kubernetes.io/projected/65b9b252-7788-4f34-9046-a58499e7e849-kube-api-access-g8kj8\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysconfig\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-tuned\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576548 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/709e5989-ba48-455a-b8a9-25c4eafebaa4-systemd-units\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-system-cni-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-device-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-sysconfig\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-os-release\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576698 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/65b9b252-7788-4f34-9046-a58499e7e849-os-release\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-sys-fs\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576785 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba2c74ab-e348-46bf-a8a9-3b804800268d-system-cni-dir\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.577185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-tmp\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.576788 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/438a43a6-e3d8-4f9f-ac52-b92baf10df16-device-dir\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.577425 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/48038e4e-5e89-4f13-aeb7-05e1197d4475-agent-certs\") pod \"konnectivity-agent-r7mvj\" (UID: \"48038e4e-5e89-4f13-aeb7-05e1197d4475\") " pod="kube-system/konnectivity-agent-r7mvj"
Apr 17 14:21:46.579565 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.578686 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/709e5989-ba48-455a-b8a9-25c4eafebaa4-ovn-node-metrics-cert\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:21:46.580269 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.579407 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-etc-tuned\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh"
Apr 17 14:21:46.580269 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.579512 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 14:21:46.580269 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.579536 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 14:21:46.580269 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.579552 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sns99 for pod openshift-network-diagnostics/network-check-target-6twhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 14:21:46.580269 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:46.579777 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99 podName:28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b nodeName:}" failed.
No retries permitted until 2026-04-17 14:21:47.079759941 +0000 UTC m=+2.113965829 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sns99" (UniqueName: "kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99") pod "network-check-target-6twhf" (UID: "28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:46.584138 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.584073 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p2jn\" (UniqueName: \"kubernetes.io/projected/438a43a6-e3d8-4f9f-ac52-b92baf10df16-kube-api-access-4p2jn\") pod \"aws-ebs-csi-driver-node-6xbpp\" (UID: \"438a43a6-e3d8-4f9f-ac52-b92baf10df16\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.585837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.585795 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rw44\" (UniqueName: \"kubernetes.io/projected/1e41c42d-5f24-4aea-b32e-08f6d5dedcde-kube-api-access-8rw44\") pod \"tuned-sqzjh\" (UID: \"1e41c42d-5f24-4aea-b32e-08f6d5dedcde\") " pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.586726 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.586699 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq4lf\" (UniqueName: \"kubernetes.io/projected/641f1866-16ba-4a95-9644-d449d621c322-kube-api-access-wq4lf\") pod \"iptables-alerter-zw8fl\" (UID: \"641f1866-16ba-4a95-9644-d449d621c322\") " pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.586968 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.586947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6cz\" (UniqueName: 
\"kubernetes.io/projected/709e5989-ba48-455a-b8a9-25c4eafebaa4-kube-api-access-4d6cz\") pod \"ovnkube-node-5fm94\" (UID: \"709e5989-ba48-455a-b8a9-25c4eafebaa4\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.587100 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.587078 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkm9p\" (UniqueName: \"kubernetes.io/projected/d5d72b15-9ee0-40a2-b530-7847abb993f0-kube-api-access-pkm9p\") pod \"node-ca-r9nsg\" (UID: \"d5d72b15-9ee0-40a2-b530-7847abb993f0\") " pod="openshift-image-registry/node-ca-r9nsg" Apr 17 14:21:46.587671 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.587642 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8kj8\" (UniqueName: \"kubernetes.io/projected/65b9b252-7788-4f34-9046-a58499e7e849-kube-api-access-g8kj8\") pod \"multus-additional-cni-plugins-6zwtc\" (UID: \"65b9b252-7788-4f34-9046-a58499e7e849\") " pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.588088 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.588065 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxm4q\" (UniqueName: \"kubernetes.io/projected/7f633d5b-7896-43f3-b506-dc236c755507-kube-api-access-wxm4q\") pod \"node-resolver-6f2x6\" (UID: \"7f633d5b-7896-43f3-b506-dc236c755507\") " pod="openshift-dns/node-resolver-6f2x6" Apr 17 14:21:46.588160 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.588125 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:21:46.588160 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.588128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnqs\" (UniqueName: \"kubernetes.io/projected/41c68694-ceb3-44f8-a9e8-e0655e8aa848-kube-api-access-qwnqs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:46.588256 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.588157 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv754\" (UniqueName: \"kubernetes.io/projected/ba2c74ab-e348-46bf-a8a9-3b804800268d-kube-api-access-zv754\") pod \"multus-xgf47\" (UID: \"ba2c74ab-e348-46bf-a8a9-3b804800268d\") " pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.592935 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.592920 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" Apr 17 14:21:46.620314 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.620276 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3514b8218264fee7cad79c255e536dea.slice/crio-9adad017127ce8b0ab917ad644f37144086355f2881f42e8da68ee562c00dc93 WatchSource:0}: Error finding container 9adad017127ce8b0ab917ad644f37144086355f2881f42e8da68ee562c00dc93: Status 404 returned error can't find the container with id 9adad017127ce8b0ab917ad644f37144086355f2881f42e8da68ee562c00dc93 Apr 17 14:21:46.620738 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.620698 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca1ef78a831aed884654af106b7c2ef.slice/crio-27ee7b8bfe07c737b750516155be4ca8153667f626ef95feb5830a04a8bdeb43 WatchSource:0}: Error finding container 27ee7b8bfe07c737b750516155be4ca8153667f626ef95feb5830a04a8bdeb43: Status 404 returned error can't find the container with id 27ee7b8bfe07c737b750516155be4ca8153667f626ef95feb5830a04a8bdeb43 Apr 17 14:21:46.625250 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.625226 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:21:46.768091 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.768058 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-r9nsg" Apr 17 14:21:46.774488 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.774315 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d72b15_9ee0_40a2_b530_7847abb993f0.slice/crio-7381b1015fb7abc2d0a7ff46fad6ba735f50117eae87de316859a89e8c10d3b5 WatchSource:0}: Error finding container 7381b1015fb7abc2d0a7ff46fad6ba735f50117eae87de316859a89e8c10d3b5: Status 404 returned error can't find the container with id 7381b1015fb7abc2d0a7ff46fad6ba735f50117eae87de316859a89e8c10d3b5 Apr 17 14:21:46.782770 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.782750 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" Apr 17 14:21:46.788495 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.788473 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b9b252_7788_4f34_9046_a58499e7e849.slice/crio-d9a7019a0b0b16a3aae367bca4f680efdb73256fdf649987e8d36c8b80624e91 WatchSource:0}: Error finding container d9a7019a0b0b16a3aae367bca4f680efdb73256fdf649987e8d36c8b80624e91: Status 404 returned error can't find the container with id d9a7019a0b0b16a3aae367bca4f680efdb73256fdf649987e8d36c8b80624e91 Apr 17 14:21:46.808609 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.808581 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-r7mvj" Apr 17 14:21:46.816542 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.816512 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48038e4e_5e89_4f13_aeb7_05e1197d4475.slice/crio-aadddc065fd450ae34e908cbd4cd3ac3bf15220c7e69fce43274f8714e274867 WatchSource:0}: Error finding container aadddc065fd450ae34e908cbd4cd3ac3bf15220c7e69fce43274f8714e274867: Status 404 returned error can't find the container with id aadddc065fd450ae34e908cbd4cd3ac3bf15220c7e69fce43274f8714e274867 Apr 17 14:21:46.820600 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.820583 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" Apr 17 14:21:46.826242 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.826225 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6f2x6" Apr 17 14:21:46.826771 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.826744 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e41c42d_5f24_4aea_b32e_08f6d5dedcde.slice/crio-5f0319e084977342cc52b7b9f287385751da3965b101a93fb1a3bcbe87dcddd5 WatchSource:0}: Error finding container 5f0319e084977342cc52b7b9f287385751da3965b101a93fb1a3bcbe87dcddd5: Status 404 returned error can't find the container with id 5f0319e084977342cc52b7b9f287385751da3965b101a93fb1a3bcbe87dcddd5 Apr 17 14:21:46.832020 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.831994 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f633d5b_7896_43f3_b506_dc236c755507.slice/crio-871365d8cc503253469af4187ee3fa972b83f1f424413b67b869ac4c8a21eee3 WatchSource:0}: Error finding container 
871365d8cc503253469af4187ee3fa972b83f1f424413b67b869ac4c8a21eee3: Status 404 returned error can't find the container with id 871365d8cc503253469af4187ee3fa972b83f1f424413b67b869ac4c8a21eee3 Apr 17 14:21:46.849763 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.849731 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709e5989_ba48_455a_b8a9_25c4eafebaa4.slice/crio-15a001de7903ba4ba03cd32278e6391c488f929b064465614f1c4391769ee4cb WatchSource:0}: Error finding container 15a001de7903ba4ba03cd32278e6391c488f929b064465614f1c4391769ee4cb: Status 404 returned error can't find the container with id 15a001de7903ba4ba03cd32278e6391c488f929b064465614f1c4391769ee4cb Apr 17 14:21:46.855633 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.855615 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xgf47" Apr 17 14:21:46.861490 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.861440 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba2c74ab_e348_46bf_a8a9_3b804800268d.slice/crio-273332f173d930401fe8fcbe7a607c0e1dfce3eed7699295e4669d9071b5d814 WatchSource:0}: Error finding container 273332f173d930401fe8fcbe7a607c0e1dfce3eed7699295e4669d9071b5d814: Status 404 returned error can't find the container with id 273332f173d930401fe8fcbe7a607c0e1dfce3eed7699295e4669d9071b5d814 Apr 17 14:21:46.862174 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:46.862134 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-zw8fl" Apr 17 14:21:46.868302 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.868280 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod641f1866_16ba_4a95_9644_d449d621c322.slice/crio-ea90f7353564e0553534d80bf99763bb2fb5847c74f5e4f18ba7a2b7a621bfd0 WatchSource:0}: Error finding container ea90f7353564e0553534d80bf99763bb2fb5847c74f5e4f18ba7a2b7a621bfd0: Status 404 returned error can't find the container with id ea90f7353564e0553534d80bf99763bb2fb5847c74f5e4f18ba7a2b7a621bfd0 Apr 17 14:21:46.899450 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:21:46.899420 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod438a43a6_e3d8_4f9f_ac52_b92baf10df16.slice/crio-1cad7e0619cc6d225178768d9968181b00ec0f20cf9b9ab7d2d528b66b28f966 WatchSource:0}: Error finding container 1cad7e0619cc6d225178768d9968181b00ec0f20cf9b9ab7d2d528b66b28f966: Status 404 returned error can't find the container with id 1cad7e0619cc6d225178768d9968181b00ec0f20cf9b9ab7d2d528b66b28f966 Apr 17 14:21:47.078848 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.078809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:47.079017 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:47.078994 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:47.079091 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:47.079080 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:48.079057942 +0000 UTC m=+3.113263821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:47.180924 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.180125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:47.180924 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:47.180340 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:47.180924 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:47.180377 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:47.180924 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:47.180391 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sns99 for pod openshift-network-diagnostics/network-check-target-6twhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:47.180924 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:47.180486 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99 podName:28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b nodeName:}" failed. No retries permitted until 2026-04-17 14:21:48.180465882 +0000 UTC m=+3.214671764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sns99" (UniqueName: "kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99") pod "network-check-target-6twhf" (UID: "28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:47.190054 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.189835 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:47.509625 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.509576 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:16:46 +0000 UTC" deadline="2027-12-19 03:55:17.207978817 +0000 UTC" Apr 17 14:21:47.509625 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.509623 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14653h33m29.698361048s" Apr 17 14:21:47.564079 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.564010 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:47.597254 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.597222 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:47.597435 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:47.597394 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:21:47.609261 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.609054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" event={"ID":"438a43a6-e3d8-4f9f-ac52-b92baf10df16","Type":"ContainerStarted","Data":"1cad7e0619cc6d225178768d9968181b00ec0f20cf9b9ab7d2d528b66b28f966"} Apr 17 14:21:47.627936 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.627657 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zw8fl" event={"ID":"641f1866-16ba-4a95-9644-d449d621c322","Type":"ContainerStarted","Data":"ea90f7353564e0553534d80bf99763bb2fb5847c74f5e4f18ba7a2b7a621bfd0"} Apr 17 14:21:47.629505 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.629421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xgf47" event={"ID":"ba2c74ab-e348-46bf-a8a9-3b804800268d","Type":"ContainerStarted","Data":"273332f173d930401fe8fcbe7a607c0e1dfce3eed7699295e4669d9071b5d814"} Apr 17 14:21:47.631761 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.631701 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6f2x6" event={"ID":"7f633d5b-7896-43f3-b506-dc236c755507","Type":"ContainerStarted","Data":"871365d8cc503253469af4187ee3fa972b83f1f424413b67b869ac4c8a21eee3"} Apr 17 14:21:47.654341 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.654293 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r7mvj" event={"ID":"48038e4e-5e89-4f13-aeb7-05e1197d4475","Type":"ContainerStarted","Data":"aadddc065fd450ae34e908cbd4cd3ac3bf15220c7e69fce43274f8714e274867"} Apr 17 14:21:47.685419 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.685378 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerStarted","Data":"d9a7019a0b0b16a3aae367bca4f680efdb73256fdf649987e8d36c8b80624e91"} Apr 17 14:21:47.692371 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.692336 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"15a001de7903ba4ba03cd32278e6391c488f929b064465614f1c4391769ee4cb"} Apr 17 14:21:47.699744 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.699711 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" event={"ID":"1e41c42d-5f24-4aea-b32e-08f6d5dedcde","Type":"ContainerStarted","Data":"5f0319e084977342cc52b7b9f287385751da3965b101a93fb1a3bcbe87dcddd5"} Apr 17 14:21:47.739439 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.739386 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r9nsg" event={"ID":"d5d72b15-9ee0-40a2-b530-7847abb993f0","Type":"ContainerStarted","Data":"7381b1015fb7abc2d0a7ff46fad6ba735f50117eae87de316859a89e8c10d3b5"} Apr 17 14:21:47.746435 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.746391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal" event={"ID":"cca1ef78a831aed884654af106b7c2ef","Type":"ContainerStarted","Data":"27ee7b8bfe07c737b750516155be4ca8153667f626ef95feb5830a04a8bdeb43"} Apr 17 
14:21:47.752320 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.752289 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal" event={"ID":"3514b8218264fee7cad79c255e536dea","Type":"ContainerStarted","Data":"9adad017127ce8b0ab917ad644f37144086355f2881f42e8da68ee562c00dc93"} Apr 17 14:21:47.828831 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:47.828753 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:21:48.091013 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:48.090829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:48.091013 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:48.090988 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:48.091239 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:48.091051 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:50.091033347 +0000 UTC m=+5.125239226 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:48.191325 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:48.191258 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:48.191498 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:48.191440 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:48.191498 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:48.191461 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:48.191498 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:48.191475 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sns99 for pod openshift-network-diagnostics/network-check-target-6twhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:48.191651 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:48.191534 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99 podName:28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:50.191516441 +0000 UTC m=+5.225722323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sns99" (UniqueName: "kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99") pod "network-check-target-6twhf" (UID: "28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:48.510686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:48.510642 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:16:46 +0000 UTC" deadline="2027-11-18 04:49:19.861595301 +0000 UTC" Apr 17 14:21:48.510686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:48.510683 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13910h27m31.350916538s" Apr 17 14:21:48.579043 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:48.579008 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:48.579319 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:48.579142 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:21:49.581481 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:49.581445 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:49.581966 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:49.581590 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:21:50.106440 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:50.106401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:50.106629 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:50.106575 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:50.106697 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:50.106652 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:54.106629691 +0000 UTC m=+9.140835568 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:50.207320 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:50.207218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:50.207555 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:50.207394 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:50.207555 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:50.207411 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:50.207555 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:50.207425 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sns99 for pod openshift-network-diagnostics/network-check-target-6twhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:50.207555 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:50.207485 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99 podName:28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:54.207466417 +0000 UTC m=+9.241672297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-sns99" (UniqueName: "kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99") pod "network-check-target-6twhf" (UID: "28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:50.579791 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:50.579280 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:50.579791 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:50.579421 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:21:51.579193 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:51.579138 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:51.579648 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:51.579294 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:21:52.580201 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:52.579714 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:52.580201 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:52.579854 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:21:53.579752 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:53.579708 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:53.579930 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:53.579862 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:21:54.141980 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:54.141940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:54.142535 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:54.142091 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:54.142535 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:54.142184 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:02.14214962 +0000 UTC m=+17.176355502 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:54.243484 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:54.243230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:54.243484 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:54.243387 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:54.243484 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:54.243405 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:54.243484 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:54.243418 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sns99 for pod openshift-network-diagnostics/network-check-target-6twhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:54.243484 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:54.243479 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99 podName:28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b nodeName:}" failed. 
No retries permitted until 2026-04-17 14:22:02.243461522 +0000 UTC m=+17.277667402 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sns99" (UniqueName: "kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99") pod "network-check-target-6twhf" (UID: "28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:54.579046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:54.579009 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:54.579242 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:54.579197 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:21:55.580481 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:55.580393 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:55.580964 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:55.580519 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:21:56.579952 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:56.579922 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:56.580077 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:56.580047 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:21:57.579389 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:57.579352 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:57.579843 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:57.579493 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:21:58.579430 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:58.579393 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:21:58.579817 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:58.579498 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:21:59.579066 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:21:59.579032 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:21:59.579261 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:21:59.579183 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:22:00.579999 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:00.579960 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:22:00.580442 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:00.580095 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:22:01.580020 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:01.579961 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:22:01.580487 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:01.580105 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:22:02.199722 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:02.199684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:22:02.199933 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:02.199851 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:22:02.199933 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:02.199925 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:18.199909797 +0000 UTC m=+33.234115676 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:22:02.301069 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:02.301029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:22:02.301255 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:02.301233 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:22:02.301324 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:02.301262 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:22:02.301324 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:02.301276 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sns99 for pod openshift-network-diagnostics/network-check-target-6twhf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:22:02.301395 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:02.301340 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99 podName:28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b nodeName:}" failed. 
No retries permitted until 2026-04-17 14:22:18.301321184 +0000 UTC m=+33.335527059 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-sns99" (UniqueName: "kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99") pod "network-check-target-6twhf" (UID: "28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:22:02.579902 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:02.579863 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:22:02.580096 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:02.579989 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:22:03.579558 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:03.579524 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:22:03.579707 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:03.579647 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:22:04.579579 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:04.579552 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:22:04.579908 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:04.579653 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b" Apr 17 14:22:05.579912 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.579885 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:22:05.581035 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:05.580998 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848" Apr 17 14:22:05.793358 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.793324 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xgf47" event={"ID":"ba2c74ab-e348-46bf-a8a9-3b804800268d","Type":"ContainerStarted","Data":"a69fd86cf7ca045dbaec4de17d91fc3360f529bef34fee6f649bb5382e8dc36c"} Apr 17 14:22:05.794695 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.794670 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6f2x6" event={"ID":"7f633d5b-7896-43f3-b506-dc236c755507","Type":"ContainerStarted","Data":"54603ba6e22da8a4eeb2812c44284234717f3a08a07f027423e3f4eff131e994"} Apr 17 14:22:05.795941 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.795922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r7mvj" event={"ID":"48038e4e-5e89-4f13-aeb7-05e1197d4475","Type":"ContainerStarted","Data":"d9dcf6df183e38f1e699c8769c68ff87db90d6a00e42e3936f1fc84c6a7cc9a0"} Apr 17 14:22:05.797384 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.797363 2568 generic.go:358] "Generic (PLEG): container finished" podID="65b9b252-7788-4f34-9046-a58499e7e849" containerID="82163c005449af30e4d7cf144a77e8dd9e16b3adbdd0fc17a0fa2fd824a290a2" exitCode=0 Apr 17 14:22:05.797461 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.797423 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerDied","Data":"82163c005449af30e4d7cf144a77e8dd9e16b3adbdd0fc17a0fa2fd824a290a2"} Apr 17 14:22:05.802036 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.802022 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:22:05.802346 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:22:05.802328 2568 generic.go:358] "Generic (PLEG): container finished" podID="709e5989-ba48-455a-b8a9-25c4eafebaa4" containerID="4e4d0687c4b7382855d85b6559f5670e3404fa5d248b59033a5145afae6d527a" exitCode=1 Apr 17 14:22:05.802417 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.802389 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"1eabbe843e55b3a89a1dc15035bbb3ea0d5b21ff793a468ce871c3f3ce86f50d"} Apr 17 14:22:05.802417 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.802408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"070e4b10102a4427356b01a7caac9ccab8ba9da676dd2d9dd99cfe4c2081b342"} Apr 17 14:22:05.802417 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.802417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"0b33961e78ac42a50a8d49e772d3d24f2217a0341406d554a7726651478fe4d7"} Apr 17 14:22:05.802524 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.802427 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"b738960b03906be913806c406e965869b83abc5a09ed3ced1b93a65927e65e2e"} Apr 17 14:22:05.802524 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.802435 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerDied","Data":"4e4d0687c4b7382855d85b6559f5670e3404fa5d248b59033a5145afae6d527a"} Apr 17 14:22:05.802524 ip-10-0-143-215 kubenswrapper[2568]: I0417 
14:22:05.802444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"d30f78566d3c2a2a8c5adbedaa33f3ce633570693d698ac4e7ca332a33942bea"} Apr 17 14:22:05.803456 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.803438 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" event={"ID":"1e41c42d-5f24-4aea-b32e-08f6d5dedcde","Type":"ContainerStarted","Data":"7d9a6f39afad75310cbaa70e15943a0da603a5b1230466135d176a9a66254dd4"} Apr 17 14:22:05.804589 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.804566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r9nsg" event={"ID":"d5d72b15-9ee0-40a2-b530-7847abb993f0","Type":"ContainerStarted","Data":"2484b492856d41101a065de47d44e30be07f7c0ef18872ce0e522b33255e4076"} Apr 17 14:22:05.805981 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.805961 2568 generic.go:358] "Generic (PLEG): container finished" podID="cca1ef78a831aed884654af106b7c2ef" containerID="623f6730bb541777917dc0fac59198e610de1eb5cd5ed8b0c8d73c3093fcf312" exitCode=0 Apr 17 14:22:05.806076 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.805984 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal" event={"ID":"cca1ef78a831aed884654af106b7c2ef","Type":"ContainerDied","Data":"623f6730bb541777917dc0fac59198e610de1eb5cd5ed8b0c8d73c3093fcf312"} Apr 17 14:22:05.807278 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.807257 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal" event={"ID":"3514b8218264fee7cad79c255e536dea","Type":"ContainerStarted","Data":"b5c8f8c3927ab9ebd75abf71a6c3f7c7e50648d0663c7bd90ce4b62ba81ad712"} Apr 17 14:22:05.808432 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:22:05.808413 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" event={"ID":"438a43a6-e3d8-4f9f-ac52-b92baf10df16","Type":"ContainerStarted","Data":"9553750dd4642ad74ecc0c9a2306d3c95bbedc4308fc19832fb34b36dba1a0e4"} Apr 17 14:22:05.826629 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.826580 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xgf47" podStartSLOduration=2.845561301 podStartE2EDuration="20.826563935s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.862861634 +0000 UTC m=+1.897067509" lastFinishedPulling="2026-04-17 14:22:04.843864266 +0000 UTC m=+19.878070143" observedRunningTime="2026-04-17 14:22:05.809822413 +0000 UTC m=+20.844028310" watchObservedRunningTime="2026-04-17 14:22:05.826563935 +0000 UTC m=+20.860769833" Apr 17 14:22:05.839293 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.839253 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-r9nsg" podStartSLOduration=2.829613749 podStartE2EDuration="20.839239423s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.775818407 +0000 UTC m=+1.810024286" lastFinishedPulling="2026-04-17 14:22:04.785444072 +0000 UTC m=+19.819649960" observedRunningTime="2026-04-17 14:22:05.838701249 +0000 UTC m=+20.872907147" watchObservedRunningTime="2026-04-17 14:22:05.839239423 +0000 UTC m=+20.873445331" Apr 17 14:22:05.898664 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.898621 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-r7mvj" podStartSLOduration=2.901494211 podStartE2EDuration="20.898608426s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.818161132 +0000 UTC m=+1.852367006" 
lastFinishedPulling="2026-04-17 14:22:04.815275339 +0000 UTC m=+19.849481221" observedRunningTime="2026-04-17 14:22:05.876558843 +0000 UTC m=+20.910764742" watchObservedRunningTime="2026-04-17 14:22:05.898608426 +0000 UTC m=+20.932814322" Apr 17 14:22:05.898782 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.898688 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-sqzjh" podStartSLOduration=2.912812634 podStartE2EDuration="20.898684596s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.829022345 +0000 UTC m=+1.863228224" lastFinishedPulling="2026-04-17 14:22:04.814894312 +0000 UTC m=+19.849100186" observedRunningTime="2026-04-17 14:22:05.898401238 +0000 UTC m=+20.932607132" watchObservedRunningTime="2026-04-17 14:22:05.898684596 +0000 UTC m=+20.932890492" Apr 17 14:22:05.913004 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.912970 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6f2x6" podStartSLOduration=2.9223157730000002 podStartE2EDuration="20.912957087s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.833708551 +0000 UTC m=+1.867914426" lastFinishedPulling="2026-04-17 14:22:04.824349852 +0000 UTC m=+19.858555740" observedRunningTime="2026-04-17 14:22:05.912777513 +0000 UTC m=+20.946983409" watchObservedRunningTime="2026-04-17 14:22:05.912957087 +0000 UTC m=+20.947162984" Apr 17 14:22:05.926714 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:05.926667 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-215.ec2.internal" podStartSLOduration=20.926649822999998 podStartE2EDuration="20.926649823s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 14:22:05.926325363 +0000 UTC m=+20.960531258" watchObservedRunningTime="2026-04-17 14:22:05.926649823 +0000 UTC m=+20.960855719"
Apr 17 14:22:06.579421 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:06.579343 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:06.579528 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:06.579477 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b"
Apr 17 14:22:06.666725 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:06.666701 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 14:22:06.812144 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:06.812097 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal" event={"ID":"cca1ef78a831aed884654af106b7c2ef","Type":"ContainerStarted","Data":"860e8472fbd98189ea70ac11048271cb3e33b8edb56823f08382e8b4c37ca8ea"}
Apr 17 14:22:06.813593 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:06.813569 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" event={"ID":"438a43a6-e3d8-4f9f-ac52-b92baf10df16","Type":"ContainerStarted","Data":"3c467836991f9088489c11bc77d4cc44a2340943e34d8dc09fa398c3b2d5a6d6"}
Apr 17 14:22:06.814768 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:06.814745 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zw8fl" event={"ID":"641f1866-16ba-4a95-9644-d449d621c322","Type":"ContainerStarted","Data":"40e8d2957816825501acc5084afc5ac7a4bde8735428580efe947a23fa234213"}
Apr 17 14:22:06.826365 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:06.826324 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-215.ec2.internal" podStartSLOduration=21.826312783 podStartE2EDuration="21.826312783s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:22:06.826072927 +0000 UTC m=+21.860278823" watchObservedRunningTime="2026-04-17 14:22:06.826312783 +0000 UTC m=+21.860518678"
Apr 17 14:22:06.838939 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:06.838850 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zw8fl" podStartSLOduration=3.922901909 podStartE2EDuration="21.838838947s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.869727334 +0000 UTC m=+1.903933209" lastFinishedPulling="2026-04-17 14:22:04.785664368 +0000 UTC m=+19.819870247" observedRunningTime="2026-04-17 14:22:06.838637043 +0000 UTC m=+21.872842935" watchObservedRunningTime="2026-04-17 14:22:06.838838947 +0000 UTC m=+21.873044843"
Apr 17 14:22:07.380351 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.380299 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-r7mvj"
Apr 17 14:22:07.381069 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.381047 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-r7mvj"
Apr 17 14:22:07.551920 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.551796 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:22:06.666722542Z","UUID":"75bfba76-5d0d-4a64-b4f0-5907c5c10ae5","Handler":null,"Name":"","Endpoint":""}
Apr 17 14:22:07.553957 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.553786 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 14:22:07.553957 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.553961 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 14:22:07.583227 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.583201 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:22:07.583443 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:07.583312 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848"
Apr 17 14:22:07.818136 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.818093 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" event={"ID":"438a43a6-e3d8-4f9f-ac52-b92baf10df16","Type":"ContainerStarted","Data":"e035b40c36b6de3c20642635ecfca637f631d4d4bb4f96916ce2f83eb771ad74"}
Apr 17 14:22:07.821282 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.821260 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:22:07.821767 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.821739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"b99cdf0b60c7afec887a08f3af90efaf6d92cbda59e9e614b13a752a67d127bf"}
Apr 17 14:22:07.834352 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:07.834304 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6xbpp" podStartSLOduration=2.05320498 podStartE2EDuration="22.834290356s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.901653121 +0000 UTC m=+1.935859000" lastFinishedPulling="2026-04-17 14:22:07.682738497 +0000 UTC m=+22.716944376" observedRunningTime="2026-04-17 14:22:07.833990209 +0000 UTC m=+22.868196107" watchObservedRunningTime="2026-04-17 14:22:07.834290356 +0000 UTC m=+22.868496252"
Apr 17 14:22:08.579446 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:08.579403 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:08.579662 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:08.579546 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b"
Apr 17 14:22:08.823629 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:08.823594 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 14:22:09.582991 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:09.582964 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:22:09.583196 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:09.583083 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848"
Apr 17 14:22:09.858944 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:09.858914 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-r7mvj"
Apr 17 14:22:09.859538 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:09.859032 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 14:22:09.859796 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:09.859761 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-r7mvj"
Apr 17 14:22:10.579777 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.579613 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:10.579923 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:10.579844 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b"
Apr 17 14:22:10.829629 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.829594 2568 generic.go:358] "Generic (PLEG): container finished" podID="65b9b252-7788-4f34-9046-a58499e7e849" containerID="431cc8a6d15341cd684097e530ec8a22919b3b5db4348ab5729923e14d5dbe87" exitCode=0
Apr 17 14:22:10.829818 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.829672 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerDied","Data":"431cc8a6d15341cd684097e530ec8a22919b3b5db4348ab5729923e14d5dbe87"}
Apr 17 14:22:10.832635 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.832611 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:22:10.833048 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.833029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"aa0ec1f9d7ef74247e7cff000b48ce40bbb365d5ac81b6c4fdb124b7af7a6c72"}
Apr 17 14:22:10.833421 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.833403 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:22:10.833498 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.833432 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:22:10.833580 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.833565 2568 scope.go:117] "RemoveContainer" containerID="4e4d0687c4b7382855d85b6559f5670e3404fa5d248b59033a5145afae6d527a"
Apr 17 14:22:10.849443 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:10.849368 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:22:11.579853 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:11.579825 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:22:11.580413 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:11.579931 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848"
Apr 17 14:22:11.837444 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:11.837407 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerStarted","Data":"00098a489981be79c7a2ca6afe04bdef3c0585462e4dcaf4b3a110d5de5a3826"}
Apr 17 14:22:11.840966 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:11.840947 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:22:11.843248 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:11.843218 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" event={"ID":"709e5989-ba48-455a-b8a9-25c4eafebaa4","Type":"ContainerStarted","Data":"3a7c9d9c3f75b1679e7a17174f815c9365f1a2a05ac57fbc087965e8c5ea61a3"}
Apr 17 14:22:11.843663 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:11.843644 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:22:11.862555 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:11.862533 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94"
Apr 17 14:22:11.883786 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:11.883735 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" podStartSLOduration=8.900220247 podStartE2EDuration="26.883720172s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.85133848 +0000 UTC m=+1.885544357" lastFinishedPulling="2026-04-17 14:22:04.834838394 +0000 UTC m=+19.869044282" observedRunningTime="2026-04-17 14:22:11.882247675 +0000 UTC m=+26.916453583" watchObservedRunningTime="2026-04-17 14:22:11.883720172 +0000 UTC m=+26.917926092"
Apr 17 14:22:12.147581 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:12.147499 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tg9jd"]
Apr 17 14:22:12.147738 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:12.147646 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:22:12.147828 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:12.147759 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848"
Apr 17 14:22:12.149980 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:12.149956 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6twhf"]
Apr 17 14:22:12.150083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:12.150058 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:12.150177 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:12.150146 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b"
Apr 17 14:22:12.846955 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:12.846922 2568 generic.go:358] "Generic (PLEG): container finished" podID="65b9b252-7788-4f34-9046-a58499e7e849" containerID="00098a489981be79c7a2ca6afe04bdef3c0585462e4dcaf4b3a110d5de5a3826" exitCode=0
Apr 17 14:22:12.847327 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:12.847002 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerDied","Data":"00098a489981be79c7a2ca6afe04bdef3c0585462e4dcaf4b3a110d5de5a3826"}
Apr 17 14:22:13.581905 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:13.581879 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:13.582068 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:13.582005 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b"
Apr 17 14:22:13.852391 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:13.852305 2568 generic.go:358] "Generic (PLEG): container finished" podID="65b9b252-7788-4f34-9046-a58499e7e849" containerID="fcb92c4f8ace4d0b5d84b27caf6cac73ea431d53da403b4d7145c1255710570d" exitCode=0
Apr 17 14:22:13.852926 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:13.852399 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerDied","Data":"fcb92c4f8ace4d0b5d84b27caf6cac73ea431d53da403b4d7145c1255710570d"}
Apr 17 14:22:14.579183 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:14.579138 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:22:14.579361 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:14.579275 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848"
Apr 17 14:22:15.582446 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:15.582414 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:15.583052 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:15.582540 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6twhf" podUID="28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b"
Apr 17 14:22:16.579361 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.579326 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:22:16.579572 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:16.579461 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tg9jd" podUID="41c68694-ceb3-44f8-a9e8-e0655e8aa848"
Apr 17 14:22:16.754510 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.754479 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-215.ec2.internal" event="NodeReady"
Apr 17 14:22:16.754892 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.754623 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 14:22:16.798245 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.798148 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gzm7s"]
Apr 17 14:22:16.830105 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.830058 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4nk2q"]
Apr 17 14:22:16.830277 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.830248 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:16.832816 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.832685 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 14:22:16.832816 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.832765 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rkhth\""
Apr 17 14:22:16.833029 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.832847 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 14:22:16.845849 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.845815 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gzm7s"]
Apr 17 14:22:16.845849 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.845844 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4nk2q"]
Apr 17 14:22:16.846035 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.845972 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:16.848696 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.848610 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 14:22:16.848696 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.848623 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 14:22:16.848696 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.848644 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 14:22:16.848696 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.848679 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w9wr2\""
Apr 17 14:22:16.916781 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.916733 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9c4445f-88cc-4c46-800e-db32500ad34d-config-volume\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:16.917006 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.916807 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:16.917006 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.916887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:16.917006 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.916979 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7frg\" (UniqueName: \"kubernetes.io/projected/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-kube-api-access-m7frg\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:16.917150 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.917006 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9c4445f-88cc-4c46-800e-db32500ad34d-tmp-dir\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:16.917150 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:16.917033 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj97v\" (UniqueName: \"kubernetes.io/projected/a9c4445f-88cc-4c46-800e-db32500ad34d-kube-api-access-hj97v\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.018298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018259 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9c4445f-88cc-4c46-800e-db32500ad34d-config-volume\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.018469 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:17.018469 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.018469 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7frg\" (UniqueName: \"kubernetes.io/projected/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-kube-api-access-m7frg\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:17.018614 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.018486 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:22:17.018614 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.018543 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:22:17.018614 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.018568 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:17.518545927 +0000 UTC m=+32.552751806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found
Apr 17 14:22:17.018614 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.018587 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:17.518576572 +0000 UTC m=+32.552782450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found
Apr 17 14:22:17.018796 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9c4445f-88cc-4c46-800e-db32500ad34d-tmp-dir\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.018796 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj97v\" (UniqueName: \"kubernetes.io/projected/a9c4445f-88cc-4c46-800e-db32500ad34d-kube-api-access-hj97v\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.018937 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9c4445f-88cc-4c46-800e-db32500ad34d-config-volume\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.018979 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.018954 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9c4445f-88cc-4c46-800e-db32500ad34d-tmp-dir\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.030853 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.030823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj97v\" (UniqueName: \"kubernetes.io/projected/a9c4445f-88cc-4c46-800e-db32500ad34d-kube-api-access-hj97v\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.030977 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.030906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7frg\" (UniqueName: \"kubernetes.io/projected/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-kube-api-access-m7frg\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:17.522393 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.522352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:17.522393 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.522389 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:17.522628 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.522527 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:22:17.522628 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.522552 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:22:17.522628 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.522605 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:18.522584437 +0000 UTC m=+33.556790314 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found
Apr 17 14:22:17.522628 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:17.522626 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:18.522618064 +0000 UTC m=+33.556823938 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found
Apr 17 14:22:17.579807 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.579774 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:17.582528 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.582504 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 14:22:17.582528 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.582523 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qbxgh\""
Apr 17 14:22:17.582691 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:17.582539 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 14:22:18.227499 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.227457 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:22:18.227982 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:18.227625 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:22:18.227982 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:18.227707 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:50.227688959 +0000 UTC m=+65.261894837 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 14:22:18.328384 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.328348 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:18.331845 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.331816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sns99\" (UniqueName: \"kubernetes.io/projected/28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b-kube-api-access-sns99\") pod \"network-check-target-6twhf\" (UID: \"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b\") " pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:18.489190 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.489081 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6twhf"
Apr 17 14:22:18.529328 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.529272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q"
Apr 17 14:22:18.529328 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.529324 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s"
Apr 17 14:22:18.529549 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:18.529509 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:22:18.529609 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:18.529586 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:20.529567841 +0000 UTC m=+35.563773724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found
Apr 17 14:22:18.529790 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:18.529764 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:22:18.529913 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:18.529838 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:20.529820305 +0000 UTC m=+35.564026194 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found
Apr 17 14:22:18.579396 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.579364 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:22:18.581947 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.581917 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hxvwz\"" Apr 17 14:22:18.581947 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:18.581937 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:22:19.890156 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:19.890126 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6twhf"] Apr 17 14:22:19.894865 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:22:19.894838 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28cf6cf1_4585_4ab8_b8a2_7d3fc29cc38b.slice/crio-d775e98d6719dc6423f2fbc5c8d45e684f8cb29ee44d7bf6d7b9d083aec8df54 WatchSource:0}: Error finding container d775e98d6719dc6423f2fbc5c8d45e684f8cb29ee44d7bf6d7b9d083aec8df54: Status 404 returned error can't find the container with id d775e98d6719dc6423f2fbc5c8d45e684f8cb29ee44d7bf6d7b9d083aec8df54 Apr 17 14:22:20.544507 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:20.544411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:22:20.544507 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:20.544459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " 
pod="openshift-dns/dns-default-gzm7s" Apr 17 14:22:20.544733 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:20.544583 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:20.544733 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:20.544605 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:22:20.544733 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:20.544648 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:24.544628817 +0000 UTC m=+39.578834698 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found Apr 17 14:22:20.544891 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:20.544768 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:24.54474581 +0000 UTC m=+39.578951689 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found Apr 17 14:22:20.870327 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:20.870067 2568 generic.go:358] "Generic (PLEG): container finished" podID="65b9b252-7788-4f34-9046-a58499e7e849" containerID="d5a7a32c697d3902b44ca75e8dacd5a173bc5498348242ae11b4fb2b629add06" exitCode=0 Apr 17 14:22:20.870500 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:20.870176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerDied","Data":"d5a7a32c697d3902b44ca75e8dacd5a173bc5498348242ae11b4fb2b629add06"} Apr 17 14:22:20.871723 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:20.871535 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6twhf" event={"ID":"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b","Type":"ContainerStarted","Data":"d775e98d6719dc6423f2fbc5c8d45e684f8cb29ee44d7bf6d7b9d083aec8df54"} Apr 17 14:22:21.875897 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:21.875870 2568 generic.go:358] "Generic (PLEG): container finished" podID="65b9b252-7788-4f34-9046-a58499e7e849" containerID="4f66fa5ea0a4701b0eb300d966e075cd31f89f9fe4ea1458e93df827987f1c0c" exitCode=0 Apr 17 14:22:21.876332 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:21.875915 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerDied","Data":"4f66fa5ea0a4701b0eb300d966e075cd31f89f9fe4ea1458e93df827987f1c0c"} Apr 17 14:22:23.881496 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:23.881457 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-6twhf" event={"ID":"28cf6cf1-4585-4ab8-b8a2-7d3fc29cc38b","Type":"ContainerStarted","Data":"e6fa7ddb1efbc295590a6c190d07b22572ef60caa143cca0dc0d17402ac0063f"} Apr 17 14:22:23.882141 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:23.881567 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:22:23.884382 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:23.884358 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" event={"ID":"65b9b252-7788-4f34-9046-a58499e7e849","Type":"ContainerStarted","Data":"506bf5ad4e826bfde98f7f362ce941475030d1f516b6d16fd64c589507d8c11a"} Apr 17 14:22:23.897331 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:23.897293 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6twhf" podStartSLOduration=35.87764491 podStartE2EDuration="38.897282424s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:22:19.99850737 +0000 UTC m=+35.032713245" lastFinishedPulling="2026-04-17 14:22:23.01814487 +0000 UTC m=+38.052350759" observedRunningTime="2026-04-17 14:22:23.89663208 +0000 UTC m=+38.930837979" watchObservedRunningTime="2026-04-17 14:22:23.897282424 +0000 UTC m=+38.931488317" Apr 17 14:22:23.915564 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:23.915526 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6zwtc" podStartSLOduration=5.683777438 podStartE2EDuration="38.915516859s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:46.790769111 +0000 UTC m=+1.824974988" lastFinishedPulling="2026-04-17 14:22:20.022508525 +0000 UTC m=+35.056714409" observedRunningTime="2026-04-17 14:22:23.914928611 +0000 UTC 
m=+38.949134508" watchObservedRunningTime="2026-04-17 14:22:23.915516859 +0000 UTC m=+38.949722753" Apr 17 14:22:24.575341 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:24.575299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:22:24.575341 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:24.575343 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s" Apr 17 14:22:24.575595 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:24.575451 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:22:24.575595 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:24.575467 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:24.575595 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:24.575511 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:32.575496648 +0000 UTC m=+47.609702522 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found Apr 17 14:22:24.575595 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:24.575527 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:32.575520113 +0000 UTC m=+47.609725987 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found Apr 17 14:22:32.628380 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:32.628339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:22:32.628380 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:32.628382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s" Apr 17 14:22:32.628875 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:32.628478 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:32.628875 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:32.628481 2568 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:22:32.628875 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:32.628531 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:48.6285171 +0000 UTC m=+63.662722975 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found Apr 17 14:22:32.628875 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:32.628544 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:48.628538643 +0000 UTC m=+63.662744517 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found Apr 17 14:22:43.868708 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:43.868676 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fm94" Apr 17 14:22:48.634244 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:48.634205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:22:48.634244 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:48.634245 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s" Apr 17 14:22:48.634655 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:48.634357 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:22:48.634655 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:48.634416 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:23:20.634400945 +0000 UTC m=+95.668606821 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found Apr 17 14:22:48.634655 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:48.634357 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:22:48.634655 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:48.634495 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:20.634483823 +0000 UTC m=+95.668689703 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found Apr 17 14:22:50.245021 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:50.244982 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd" Apr 17 14:22:50.248134 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:50.248113 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:22:50.255458 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:50.255435 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:22:50.255525 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:22:50.255515 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs podName:41c68694-ceb3-44f8-a9e8-e0655e8aa848 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:54.2554934 +0000 UTC m=+129.289699276 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs") pod "network-metrics-daemon-tg9jd" (UID: "41c68694-ceb3-44f8-a9e8-e0655e8aa848") : secret "metrics-daemon-secret" not found Apr 17 14:22:54.888989 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:22:54.888883 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6twhf" Apr 17 14:23:17.276367 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.276331 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l"] Apr 17 14:23:17.280638 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.280619 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:17.283276 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.283253 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.283401 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.283273 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.283401 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.283257 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 14:23:17.284419 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.284404 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5v92j\"" Apr 17 14:23:17.288418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.288395 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l"] Apr 17 14:23:17.315180 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.315135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98phl\" (UniqueName: \"kubernetes.io/projected/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-kube-api-access-98phl\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:17.315331 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.315203 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:17.374333 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.374297 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx"] Apr 17 14:23:17.377046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.377031 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" Apr 17 14:23:17.379679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.379652 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-86g4k\"" Apr 17 14:23:17.379780 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.379652 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.379852 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.379832 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.381436 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.381414 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4"] Apr 17 14:23:17.383999 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.383979 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ln5j2"] Apr 17 14:23:17.384179 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.384152 2568 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.386893 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.386872 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx"] Apr 17 14:23:17.387009 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.386995 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.387459 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.387436 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.387664 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.387645 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 14:23:17.387753 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.387650 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.388068 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.388049 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 14:23:17.388254 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.388240 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-tfjsq\"" Apr 17 14:23:17.389491 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.389476 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-btbzt\"" Apr 17 14:23:17.389650 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.389636 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 14:23:17.390459 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.390442 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:23:17.390600 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.390583 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 14:23:17.391054 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.391026 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 14:23:17.396275 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.396242 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 14:23:17.397033 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.397006 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4"] Apr 17 14:23:17.409177 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.409138 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ln5j2"] Apr 17 14:23:17.415729 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.415705 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9vx\" (UniqueName: \"kubernetes.io/projected/b9528958-b786-4c25-8d67-30d1493f6002-kube-api-access-pn9vx\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: 
\"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.415854 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.415738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93531d07-7bae-4782-818d-d6e8ceecf396-config\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.415854 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.415759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9528958-b786-4c25-8d67-30d1493f6002-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: \"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.415854 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.415836 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93531d07-7bae-4782-818d-d6e8ceecf396-trusted-ca\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.415973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.415867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:17.415973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.415889 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngtgn\" (UniqueName: \"kubernetes.io/projected/bec6e69e-bcfa-4627-9496-dbf9608ffd71-kube-api-access-ngtgn\") pod \"volume-data-source-validator-7c6cbb6c87-drrsx\" (UID: \"bec6e69e-bcfa-4627-9496-dbf9608ffd71\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" Apr 17 14:23:17.415973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.415937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9528958-b786-4c25-8d67-30d1493f6002-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: \"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.416074 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.416008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98phl\" (UniqueName: \"kubernetes.io/projected/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-kube-api-access-98phl\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:17.416074 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:17.416014 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:23:17.416074 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.416039 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/93531d07-7bae-4782-818d-d6e8ceecf396-serving-cert\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.416401 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.416382 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdghx\" (UniqueName: \"kubernetes.io/projected/93531d07-7bae-4782-818d-d6e8ceecf396-kube-api-access-rdghx\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.416482 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:17.416350 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls podName:ae7c46b5-41a6-4f3c-b2a7-c9701c82e890 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:17.916330636 +0000 UTC m=+92.950536511 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zhr7l" (UID: "ae7c46b5-41a6-4f3c-b2a7-c9701c82e890") : secret "samples-operator-tls" not found Apr 17 14:23:17.427232 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.427206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98phl\" (UniqueName: \"kubernetes.io/projected/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-kube-api-access-98phl\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:17.517345 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.517303 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93531d07-7bae-4782-818d-d6e8ceecf396-serving-cert\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.517345 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.517348 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdghx\" (UniqueName: \"kubernetes.io/projected/93531d07-7bae-4782-818d-d6e8ceecf396-kube-api-access-rdghx\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.517599 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.517372 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9vx\" (UniqueName: \"kubernetes.io/projected/b9528958-b786-4c25-8d67-30d1493f6002-kube-api-access-pn9vx\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: 
\"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.517599 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.517501 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93531d07-7bae-4782-818d-d6e8ceecf396-config\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.517599 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.517541 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9528958-b786-4c25-8d67-30d1493f6002-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: \"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.517599 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.517565 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93531d07-7bae-4782-818d-d6e8ceecf396-trusted-ca\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.517797 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.517684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngtgn\" (UniqueName: \"kubernetes.io/projected/bec6e69e-bcfa-4627-9496-dbf9608ffd71-kube-api-access-ngtgn\") pod \"volume-data-source-validator-7c6cbb6c87-drrsx\" (UID: \"bec6e69e-bcfa-4627-9496-dbf9608ffd71\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" Apr 17 14:23:17.517797 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:23:17.517725 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9528958-b786-4c25-8d67-30d1493f6002-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: \"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.518193 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.518153 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9528958-b786-4c25-8d67-30d1493f6002-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: \"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.518336 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.518316 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93531d07-7bae-4782-818d-d6e8ceecf396-config\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.518400 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.518346 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93531d07-7bae-4782-818d-d6e8ceecf396-trusted-ca\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.519795 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.519771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/93531d07-7bae-4782-818d-d6e8ceecf396-serving-cert\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.519886 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.519797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9528958-b786-4c25-8d67-30d1493f6002-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: \"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.525647 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.525616 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdghx\" (UniqueName: \"kubernetes.io/projected/93531d07-7bae-4782-818d-d6e8ceecf396-kube-api-access-rdghx\") pod \"console-operator-9d4b6777b-ln5j2\" (UID: \"93531d07-7bae-4782-818d-d6e8ceecf396\") " pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.525786 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.525768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngtgn\" (UniqueName: \"kubernetes.io/projected/bec6e69e-bcfa-4627-9496-dbf9608ffd71-kube-api-access-ngtgn\") pod \"volume-data-source-validator-7c6cbb6c87-drrsx\" (UID: \"bec6e69e-bcfa-4627-9496-dbf9608ffd71\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" Apr 17 14:23:17.526014 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.525993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9vx\" (UniqueName: \"kubernetes.io/projected/b9528958-b786-4c25-8d67-30d1493f6002-kube-api-access-pn9vx\") pod \"kube-storage-version-migrator-operator-6769c5d45-9p4v4\" (UID: 
\"b9528958-b786-4c25-8d67-30d1493f6002\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.686000 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.685887 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" Apr 17 14:23:17.695821 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.695795 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" Apr 17 14:23:17.701588 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.701567 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:23:17.840069 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.840029 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx"] Apr 17 14:23:17.843313 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:17.843283 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbec6e69e_bcfa_4627_9496_dbf9608ffd71.slice/crio-176ee95bfefd15e32914bb57d59858393dacfb6e67d6802cd7509b8016b15334 WatchSource:0}: Error finding container 176ee95bfefd15e32914bb57d59858393dacfb6e67d6802cd7509b8016b15334: Status 404 returned error can't find the container with id 176ee95bfefd15e32914bb57d59858393dacfb6e67d6802cd7509b8016b15334 Apr 17 14:23:17.857732 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.857690 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4"] Apr 17 14:23:17.861406 ip-10-0-143-215 kubenswrapper[2568]: W0417 
14:23:17.861376 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9528958_b786_4c25_8d67_30d1493f6002.slice/crio-b5bd9e683971291762adfa7756b231414b98ea3829a53c111b640e6a5376a3b4 WatchSource:0}: Error finding container b5bd9e683971291762adfa7756b231414b98ea3829a53c111b640e6a5376a3b4: Status 404 returned error can't find the container with id b5bd9e683971291762adfa7756b231414b98ea3829a53c111b640e6a5376a3b4 Apr 17 14:23:17.874387 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.874351 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ln5j2"] Apr 17 14:23:17.877133 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:17.877103 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93531d07_7bae_4782_818d_d6e8ceecf396.slice/crio-03976d1abd31389bd8df2b49dfc1389f9eea5b82cb13ce109982c19d3d4ad3d5 WatchSource:0}: Error finding container 03976d1abd31389bd8df2b49dfc1389f9eea5b82cb13ce109982c19d3d4ad3d5: Status 404 returned error can't find the container with id 03976d1abd31389bd8df2b49dfc1389f9eea5b82cb13ce109982c19d3d4ad3d5 Apr 17 14:23:17.920720 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.920674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:17.920888 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:17.920826 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:23:17.920926 ip-10-0-143-215 
kubenswrapper[2568]: E0417 14:23:17.920898 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls podName:ae7c46b5-41a6-4f3c-b2a7-c9701c82e890 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:18.920880712 +0000 UTC m=+93.955086586 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zhr7l" (UID: "ae7c46b5-41a6-4f3c-b2a7-c9701c82e890") : secret "samples-operator-tls" not found Apr 17 14:23:17.989222 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.989192 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" event={"ID":"bec6e69e-bcfa-4627-9496-dbf9608ffd71","Type":"ContainerStarted","Data":"176ee95bfefd15e32914bb57d59858393dacfb6e67d6802cd7509b8016b15334"} Apr 17 14:23:17.990111 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.990088 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" event={"ID":"93531d07-7bae-4782-818d-d6e8ceecf396","Type":"ContainerStarted","Data":"03976d1abd31389bd8df2b49dfc1389f9eea5b82cb13ce109982c19d3d4ad3d5"} Apr 17 14:23:17.990906 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:17.990883 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" event={"ID":"b9528958-b786-4c25-8d67-30d1493f6002","Type":"ContainerStarted","Data":"b5bd9e683971291762adfa7756b231414b98ea3829a53c111b640e6a5376a3b4"} Apr 17 14:23:18.930422 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:18.930370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:18.930901 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:18.930545 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:23:18.930901 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:18.930626 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls podName:ae7c46b5-41a6-4f3c-b2a7-c9701c82e890 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:20.93060624 +0000 UTC m=+95.964812116 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zhr7l" (UID: "ae7c46b5-41a6-4f3c-b2a7-c9701c82e890") : secret "samples-operator-tls" not found Apr 17 14:23:19.995639 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:19.995594 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" event={"ID":"bec6e69e-bcfa-4627-9496-dbf9608ffd71","Type":"ContainerStarted","Data":"7556f6cddfe5e5fcaeb07575b120381691041d6f827b0c20453d69a946c63398"} Apr 17 14:23:20.010261 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.010209 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-drrsx" podStartSLOduration=1.476509814 podStartE2EDuration="3.010190674s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="2026-04-17 14:23:17.845831006 +0000 UTC m=+92.880036882" 
lastFinishedPulling="2026-04-17 14:23:19.37951186 +0000 UTC m=+94.413717742" observedRunningTime="2026-04-17 14:23:20.009242305 +0000 UTC m=+95.043448208" watchObservedRunningTime="2026-04-17 14:23:20.010190674 +0000 UTC m=+95.044396571" Apr 17 14:23:20.642970 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.642936 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s" Apr 17 14:23:20.643118 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.643052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:23:20.643118 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:20.643083 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:23:20.643273 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:20.643145 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:23:20.643273 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:20.643152 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls podName:a9c4445f-88cc-4c46-800e-db32500ad34d nodeName:}" failed. No retries permitted until 2026-04-17 14:24:24.643135144 +0000 UTC m=+159.677341020 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls") pod "dns-default-gzm7s" (UID: "a9c4445f-88cc-4c46-800e-db32500ad34d") : secret "dns-default-metrics-tls" not found Apr 17 14:23:20.643273 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:20.643220 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert podName:0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7 nodeName:}" failed. No retries permitted until 2026-04-17 14:24:24.643201745 +0000 UTC m=+159.677407634 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert") pod "ingress-canary-4nk2q" (UID: "0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7") : secret "canary-serving-cert" not found Apr 17 14:23:20.946058 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.945963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" Apr 17 14:23:20.946224 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:20.946106 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:23:20.946224 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:20.946193 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls podName:ae7c46b5-41a6-4f3c-b2a7-c9701c82e890 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:24.94615752 +0000 UTC m=+99.980363398 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zhr7l" (UID: "ae7c46b5-41a6-4f3c-b2a7-c9701c82e890") : secret "samples-operator-tls" not found Apr 17 14:23:20.999470 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.999442 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/0.log" Apr 17 14:23:20.999877 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.999484 2568 generic.go:358] "Generic (PLEG): container finished" podID="93531d07-7bae-4782-818d-d6e8ceecf396" containerID="2ebe6a5e7cb76e816856c47ae0403c0e57f834c0aa294088abdaaec9fb2a0a49" exitCode=255 Apr 17 14:23:20.999877 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.999553 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" event={"ID":"93531d07-7bae-4782-818d-d6e8ceecf396","Type":"ContainerDied","Data":"2ebe6a5e7cb76e816856c47ae0403c0e57f834c0aa294088abdaaec9fb2a0a49"} Apr 17 14:23:20.999877 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:20.999766 2568 scope.go:117] "RemoveContainer" containerID="2ebe6a5e7cb76e816856c47ae0403c0e57f834c0aa294088abdaaec9fb2a0a49" Apr 17 14:23:21.001007 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:21.000975 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" event={"ID":"b9528958-b786-4c25-8d67-30d1493f6002","Type":"ContainerStarted","Data":"f17c4574c4d9b1b605a5b5394d6edecd57153af198e493f0bed18a000d8d62da"} Apr 17 14:23:21.028821 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:21.028777 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" podStartSLOduration=1.283791233 podStartE2EDuration="4.028762345s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="2026-04-17 14:23:17.863115876 +0000 UTC m=+92.897321752" lastFinishedPulling="2026-04-17 14:23:20.608086988 +0000 UTC m=+95.642292864" observedRunningTime="2026-04-17 14:23:21.028444819 +0000 UTC m=+96.062650745" watchObservedRunningTime="2026-04-17 14:23:21.028762345 +0000 UTC m=+96.062968243" Apr 17 14:23:22.004765 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.004735 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/1.log" Apr 17 14:23:22.005244 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.005130 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/0.log" Apr 17 14:23:22.005244 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.005190 2568 generic.go:358] "Generic (PLEG): container finished" podID="93531d07-7bae-4782-818d-d6e8ceecf396" containerID="0fa730c18a8e854a26d86e1ea3602ca3401635d0e156685a4f118351268e1614" exitCode=255 Apr 17 14:23:22.005357 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.005275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" event={"ID":"93531d07-7bae-4782-818d-d6e8ceecf396","Type":"ContainerDied","Data":"0fa730c18a8e854a26d86e1ea3602ca3401635d0e156685a4f118351268e1614"} Apr 17 14:23:22.005357 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.005320 2568 scope.go:117] "RemoveContainer" containerID="2ebe6a5e7cb76e816856c47ae0403c0e57f834c0aa294088abdaaec9fb2a0a49" Apr 17 14:23:22.005548 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.005528 2568 
scope.go:117] "RemoveContainer" containerID="0fa730c18a8e854a26d86e1ea3602ca3401635d0e156685a4f118351268e1614" Apr 17 14:23:22.005776 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:22.005753 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln5j2_openshift-console-operator(93531d07-7bae-4782-818d-d6e8ceecf396)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" podUID="93531d07-7bae-4782-818d-d6e8ceecf396" Apr 17 14:23:22.338265 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.338180 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd"] Apr 17 14:23:22.342064 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.342048 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" Apr 17 14:23:22.344605 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.344576 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-zml4v\"" Apr 17 14:23:22.349075 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.349048 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd"] Apr 17 14:23:22.457118 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.457083 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d55c\" (UniqueName: \"kubernetes.io/projected/fb1edc68-ffea-48a4-bf76-be19ef97be78-kube-api-access-2d55c\") pod \"network-check-source-8894fc9bd-2pchd\" (UID: \"fb1edc68-ffea-48a4-bf76-be19ef97be78\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" Apr 17 14:23:22.558180 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.558113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d55c\" (UniqueName: \"kubernetes.io/projected/fb1edc68-ffea-48a4-bf76-be19ef97be78-kube-api-access-2d55c\") pod \"network-check-source-8894fc9bd-2pchd\" (UID: \"fb1edc68-ffea-48a4-bf76-be19ef97be78\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" Apr 17 14:23:22.565670 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.565643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d55c\" (UniqueName: \"kubernetes.io/projected/fb1edc68-ffea-48a4-bf76-be19ef97be78-kube-api-access-2d55c\") pod \"network-check-source-8894fc9bd-2pchd\" (UID: \"fb1edc68-ffea-48a4-bf76-be19ef97be78\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" Apr 17 14:23:22.651324 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.651227 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" Apr 17 14:23:22.769487 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:22.769457 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd"] Apr 17 14:23:22.772623 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:22.772591 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1edc68_ffea_48a4_bf76_be19ef97be78.slice/crio-ba8d116db43774f1bf29295203df5238696b47ef90017cb44451bd0be62e1292 WatchSource:0}: Error finding container ba8d116db43774f1bf29295203df5238696b47ef90017cb44451bd0be62e1292: Status 404 returned error can't find the container with id ba8d116db43774f1bf29295203df5238696b47ef90017cb44451bd0be62e1292 Apr 17 14:23:23.010979 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.010950 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/1.log" Apr 17 14:23:23.011453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.011383 2568 scope.go:117] "RemoveContainer" containerID="0fa730c18a8e854a26d86e1ea3602ca3401635d0e156685a4f118351268e1614" Apr 17 14:23:23.011608 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:23.011589 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln5j2_openshift-console-operator(93531d07-7bae-4782-818d-d6e8ceecf396)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" podUID="93531d07-7bae-4782-818d-d6e8ceecf396" Apr 17 14:23:23.012303 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.012283 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" event={"ID":"fb1edc68-ffea-48a4-bf76-be19ef97be78","Type":"ContainerStarted","Data":"e1701713debe81f0f0ae771fdabdc18ea3e6b61be5d1b43dbbe2ca594766d6b3"}
Apr 17 14:23:23.012407 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.012312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" event={"ID":"fb1edc68-ffea-48a4-bf76-be19ef97be78","Type":"ContainerStarted","Data":"ba8d116db43774f1bf29295203df5238696b47ef90017cb44451bd0be62e1292"}
Apr 17 14:23:23.040509 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.040454 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2pchd" podStartSLOduration=1.040437038 podStartE2EDuration="1.040437038s" podCreationTimestamp="2026-04-17 14:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:23:23.039462518 +0000 UTC m=+98.073668426" watchObservedRunningTime="2026-04-17 14:23:23.040437038 +0000 UTC m=+98.074642937"
Apr 17 14:23:23.264386 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.264308 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6f2x6_7f633d5b-7896-43f3-b506-dc236c755507/dns-node-resolver/0.log"
Apr 17 14:23:23.916854 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.916815 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"]
Apr 17 14:23:23.919793 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.919771 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"
Apr 17 14:23:23.922449 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.922425 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 14:23:23.923496 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.923476 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zvvvq\""
Apr 17 14:23:23.923550 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.923488 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 14:23:23.931672 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.931616 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"]
Apr 17 14:23:23.968300 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:23.968262 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf4vx\" (UniqueName: \"kubernetes.io/projected/8c673e08-3719-497d-8a89-99ae8c4bd1ee-kube-api-access-mf4vx\") pod \"migrator-74bb7799d9-nrsk2\" (UID: \"8c673e08-3719-497d-8a89-99ae8c4bd1ee\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"
Apr 17 14:23:24.068657 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:24.068619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf4vx\" (UniqueName: \"kubernetes.io/projected/8c673e08-3719-497d-8a89-99ae8c4bd1ee-kube-api-access-mf4vx\") pod \"migrator-74bb7799d9-nrsk2\" (UID: \"8c673e08-3719-497d-8a89-99ae8c4bd1ee\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"
Apr 17 14:23:24.075882 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:24.075851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf4vx\" (UniqueName: \"kubernetes.io/projected/8c673e08-3719-497d-8a89-99ae8c4bd1ee-kube-api-access-mf4vx\") pod \"migrator-74bb7799d9-nrsk2\" (UID: \"8c673e08-3719-497d-8a89-99ae8c4bd1ee\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"
Apr 17 14:23:24.233224 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:24.233183 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"
Apr 17 14:23:24.344803 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:24.344769 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2"]
Apr 17 14:23:24.347649 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:24.347618 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c673e08_3719_497d_8a89_99ae8c4bd1ee.slice/crio-be7cb66e163ee266a32111e297b8bd0fc7588c3a771847d21fdab1914f437552 WatchSource:0}: Error finding container be7cb66e163ee266a32111e297b8bd0fc7588c3a771847d21fdab1914f437552: Status 404 returned error can't find the container with id be7cb66e163ee266a32111e297b8bd0fc7588c3a771847d21fdab1914f437552
Apr 17 14:23:24.663922 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:24.663846 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r9nsg_d5d72b15-9ee0-40a2-b530-7847abb993f0/node-ca/0.log"
Apr 17 14:23:24.975581 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:24.975546 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l"
Apr 17 14:23:24.975777 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:24.975727 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 14:23:24.975858 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:24.975844 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls podName:ae7c46b5-41a6-4f3c-b2a7-c9701c82e890 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:32.975813009 +0000 UTC m=+108.010018899 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zhr7l" (UID: "ae7c46b5-41a6-4f3c-b2a7-c9701c82e890") : secret "samples-operator-tls" not found
Apr 17 14:23:25.018686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:25.018646 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2" event={"ID":"8c673e08-3719-497d-8a89-99ae8c4bd1ee","Type":"ContainerStarted","Data":"be7cb66e163ee266a32111e297b8bd0fc7588c3a771847d21fdab1914f437552"}
Apr 17 14:23:26.022818 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:26.022780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2" event={"ID":"8c673e08-3719-497d-8a89-99ae8c4bd1ee","Type":"ContainerStarted","Data":"aee55650ea74eb1ab1b695dc74e2494783a24c898249b989b23312d94a3708e5"}
Apr 17 14:23:26.022818 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:26.022818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2" event={"ID":"8c673e08-3719-497d-8a89-99ae8c4bd1ee","Type":"ContainerStarted","Data":"08514e7e087e3dc9ca0ed27d62ac50ac395d364f4dfe6332edfcbbd7eb90834f"}
Apr 17 14:23:26.045259 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:26.045210 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-nrsk2" podStartSLOduration=1.6857689869999999 podStartE2EDuration="3.045194008s" podCreationTimestamp="2026-04-17 14:23:23 +0000 UTC" firstStartedPulling="2026-04-17 14:23:24.349479978 +0000 UTC m=+99.383685853" lastFinishedPulling="2026-04-17 14:23:25.708904994 +0000 UTC m=+100.743110874" observedRunningTime="2026-04-17 14:23:26.044416971 +0000 UTC m=+101.078622868" watchObservedRunningTime="2026-04-17 14:23:26.045194008 +0000 UTC m=+101.079399898"
Apr 17 14:23:27.702203 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:27.702134 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2"
Apr 17 14:23:27.702203 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:27.702208 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2"
Apr 17 14:23:27.702636 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:27.702595 2568 scope.go:117] "RemoveContainer" containerID="0fa730c18a8e854a26d86e1ea3602ca3401635d0e156685a4f118351268e1614"
Apr 17 14:23:27.702776 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:27.702757 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln5j2_openshift-console-operator(93531d07-7bae-4782-818d-d6e8ceecf396)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" podUID="93531d07-7bae-4782-818d-d6e8ceecf396"
Apr 17 14:23:33.036614 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:33.036574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l"
Apr 17 14:23:33.038984 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:33.038947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae7c46b5-41a6-4f3c-b2a7-c9701c82e890-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zhr7l\" (UID: \"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l"
Apr 17 14:23:33.189393 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:33.189342 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l"
Apr 17 14:23:33.302834 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:33.302757 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l"]
Apr 17 14:23:34.043679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:34.043639 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" event={"ID":"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890","Type":"ContainerStarted","Data":"5f192c270d574517bccf21313ba9e0d5ac07275f3ccca01d2a62801476665058"}
Apr 17 14:23:36.050210 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:36.050156 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" event={"ID":"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890","Type":"ContainerStarted","Data":"f7f2ad3843c96acbdf63e3acd0ef82333240621efd81d9424e5799ea07009522"}
Apr 17 14:23:36.050210 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:36.050209 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" event={"ID":"ae7c46b5-41a6-4f3c-b2a7-c9701c82e890","Type":"ContainerStarted","Data":"a8463430a7f070aa1f50baeb195ac3f2c8c5e82b347c3195e434ba0a8506147c"}
Apr 17 14:23:36.066675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:36.066629 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zhr7l" podStartSLOduration=17.061452238 podStartE2EDuration="19.066616701s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="2026-04-17 14:23:33.345063126 +0000 UTC m=+108.379269001" lastFinishedPulling="2026-04-17 14:23:35.350227586 +0000 UTC m=+110.384433464" observedRunningTime="2026-04-17 14:23:36.064970898 +0000 UTC m=+111.099176795" watchObservedRunningTime="2026-04-17 14:23:36.066616701 +0000 UTC m=+111.100822591"
Apr 17 14:23:38.579396 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:38.579366 2568 scope.go:117] "RemoveContainer" containerID="0fa730c18a8e854a26d86e1ea3602ca3401635d0e156685a4f118351268e1614"
Apr 17 14:23:39.059059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:39.059024 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log"
Apr 17 14:23:39.059392 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:39.059377 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/1.log"
Apr 17 14:23:39.059452 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:39.059408 2568 generic.go:358] "Generic (PLEG): container finished" podID="93531d07-7bae-4782-818d-d6e8ceecf396" containerID="cbf32ba2fb02ae2d6ca4906d0203d2b5a15c6ab144157b43ce63a1baeff06fbe" exitCode=255
Apr 17 14:23:39.059452 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:39.059440 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" event={"ID":"93531d07-7bae-4782-818d-d6e8ceecf396","Type":"ContainerDied","Data":"cbf32ba2fb02ae2d6ca4906d0203d2b5a15c6ab144157b43ce63a1baeff06fbe"}
Apr 17 14:23:39.059540 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:39.059468 2568 scope.go:117] "RemoveContainer" containerID="0fa730c18a8e854a26d86e1ea3602ca3401635d0e156685a4f118351268e1614"
Apr 17 14:23:39.059817 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:39.059799 2568 scope.go:117] "RemoveContainer" containerID="cbf32ba2fb02ae2d6ca4906d0203d2b5a15c6ab144157b43ce63a1baeff06fbe"
Apr 17 14:23:39.060027 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:39.060009 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln5j2_openshift-console-operator(93531d07-7bae-4782-818d-d6e8ceecf396)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" podUID="93531d07-7bae-4782-818d-d6e8ceecf396"
Apr 17 14:23:40.063324 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:40.063299 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log"
Apr 17 14:23:47.701718 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:47.701669 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2"
Apr 17 14:23:47.702147 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:47.701772 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2"
Apr 17 14:23:47.702147 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:47.702029 2568 scope.go:117] "RemoveContainer" containerID="cbf32ba2fb02ae2d6ca4906d0203d2b5a15c6ab144157b43ce63a1baeff06fbe"
Apr 17 14:23:47.702247 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:47.702204 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln5j2_openshift-console-operator(93531d07-7bae-4782-818d-d6e8ceecf396)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" podUID="93531d07-7bae-4782-818d-d6e8ceecf396"
Apr 17 14:23:48.082095 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.082068 2568 scope.go:117] "RemoveContainer" containerID="cbf32ba2fb02ae2d6ca4906d0203d2b5a15c6ab144157b43ce63a1baeff06fbe"
Apr 17 14:23:48.082273 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:48.082250 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ln5j2_openshift-console-operator(93531d07-7bae-4782-818d-d6e8ceecf396)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" podUID="93531d07-7bae-4782-818d-d6e8ceecf396"
Apr 17 14:23:48.331486 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.331455 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hkr86"]
Apr 17 14:23:48.335571 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.335518 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.338015 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.337997 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 14:23:48.338136 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.338111 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 14:23:48.338543 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.338529 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 14:23:48.338613 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.338565 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 14:23:48.338613 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.338602 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j9ndf\""
Apr 17 14:23:48.343068 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.343050 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hkr86"]
Apr 17 14:23:48.452761 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.452719 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.452929 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.452807 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-data-volume\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.452929 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.452826 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.452929 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.452841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqjdp\" (UniqueName: \"kubernetes.io/projected/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-kube-api-access-cqjdp\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.452929 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.452861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-crio-socket\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.553492 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.553437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-data-volume\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.553648 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.553577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.553648 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.553605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqjdp\" (UniqueName: \"kubernetes.io/projected/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-kube-api-access-cqjdp\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.553648 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.553636 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-crio-socket\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.553769 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.553735 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-crio-socket\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.553831 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.553798 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.553887 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.553849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-data-volume\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.554107 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.554088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.556088 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.556072 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.561397 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.561373 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqjdp\" (UniqueName: \"kubernetes.io/projected/1fe1f6e6-dc60-4bb7-9375-e72c5d01275d-kube-api-access-cqjdp\") pod \"insights-runtime-extractor-hkr86\" (UID: \"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d\") " pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.645202 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.645081 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hkr86"
Apr 17 14:23:48.762886 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:48.762858 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hkr86"]
Apr 17 14:23:48.766531 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:48.766502 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe1f6e6_dc60_4bb7_9375_e72c5d01275d.slice/crio-820a77bb64f2a28b792aa60de41be4fe885b1e23653ba6d6804bee0741e80e2c WatchSource:0}: Error finding container 820a77bb64f2a28b792aa60de41be4fe885b1e23653ba6d6804bee0741e80e2c: Status 404 returned error can't find the container with id 820a77bb64f2a28b792aa60de41be4fe885b1e23653ba6d6804bee0741e80e2c
Apr 17 14:23:49.085643 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:49.085605 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hkr86" event={"ID":"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d","Type":"ContainerStarted","Data":"8a2360e3231e2b0427fe43804bb8daae898d780c1a7d02576278831df8ac828f"}
Apr 17 14:23:49.085643 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:49.085646 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hkr86" event={"ID":"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d","Type":"ContainerStarted","Data":"820a77bb64f2a28b792aa60de41be4fe885b1e23653ba6d6804bee0741e80e2c"}
Apr 17 14:23:50.089560 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:50.089521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hkr86" event={"ID":"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d","Type":"ContainerStarted","Data":"2e0cc5642c6f5420fa8139bca2ed2f77a69ef6a03a1013145389338b209c8675"}
Apr 17 14:23:52.064412 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.064378 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"]
Apr 17 14:23:52.068471 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.068456 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:52.071039 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.071016 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 14:23:52.071376 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.071359 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-9cr2j\""
Apr 17 14:23:52.074856 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.074661 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"]
Apr 17 14:23:52.095734 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.095707 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hkr86" event={"ID":"1fe1f6e6-dc60-4bb7-9375-e72c5d01275d","Type":"ContainerStarted","Data":"236b12fcf41efbe6685db6d47440aeff6490ad43d51b2429c84ebc2133e85e88"}
Apr 17 14:23:52.112180 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.112122 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hkr86" podStartSLOduration=1.875382834 podStartE2EDuration="4.112110212s" podCreationTimestamp="2026-04-17 14:23:48 +0000 UTC" firstStartedPulling="2026-04-17 14:23:48.827428626 +0000 UTC m=+123.861634509" lastFinishedPulling="2026-04-17 14:23:51.064156008 +0000 UTC m=+126.098361887" observedRunningTime="2026-04-17 14:23:52.111753023 +0000 UTC m=+127.145958921" watchObservedRunningTime="2026-04-17 14:23:52.112110212 +0000 UTC m=+127.146316109"
Apr 17 14:23:52.178940 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.178905 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3dd4d766-4849-4e53-83b5-a6a7f75bed97-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c7rdz\" (UID: \"3dd4d766-4849-4e53-83b5-a6a7f75bed97\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:52.280180 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.280127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3dd4d766-4849-4e53-83b5-a6a7f75bed97-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c7rdz\" (UID: \"3dd4d766-4849-4e53-83b5-a6a7f75bed97\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:52.280302 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:52.280269 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 14:23:52.280359 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:52.280345 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd4d766-4849-4e53-83b5-a6a7f75bed97-tls-certificates podName:3dd4d766-4849-4e53-83b5-a6a7f75bed97 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:52.780328312 +0000 UTC m=+127.814534188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/3dd4d766-4849-4e53-83b5-a6a7f75bed97-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-c7rdz" (UID: "3dd4d766-4849-4e53-83b5-a6a7f75bed97") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 14:23:52.783591 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.783550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3dd4d766-4849-4e53-83b5-a6a7f75bed97-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c7rdz\" (UID: \"3dd4d766-4849-4e53-83b5-a6a7f75bed97\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:52.785907 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.785886 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3dd4d766-4849-4e53-83b5-a6a7f75bed97-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c7rdz\" (UID: \"3dd4d766-4849-4e53-83b5-a6a7f75bed97\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:52.977276 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:52.977226 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:53.090097 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:53.090065 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"]
Apr 17 14:23:53.093027 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:53.092999 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd4d766_4849_4e53_83b5_a6a7f75bed97.slice/crio-9aea2beb52fb478c3bb25de73c778f410f07774f65dff1f2902746a1e7b157f9 WatchSource:0}: Error finding container 9aea2beb52fb478c3bb25de73c778f410f07774f65dff1f2902746a1e7b157f9: Status 404 returned error can't find the container with id 9aea2beb52fb478c3bb25de73c778f410f07774f65dff1f2902746a1e7b157f9
Apr 17 14:23:53.098992 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:53.098969 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz" event={"ID":"3dd4d766-4849-4e53-83b5-a6a7f75bed97","Type":"ContainerStarted","Data":"9aea2beb52fb478c3bb25de73c778f410f07774f65dff1f2902746a1e7b157f9"}
Apr 17 14:23:54.295551 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:54.295512 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:23:54.297788 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:54.297754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41c68694-ceb3-44f8-a9e8-e0655e8aa848-metrics-certs\") pod \"network-metrics-daemon-tg9jd\" (UID: \"41c68694-ceb3-44f8-a9e8-e0655e8aa848\") " pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:23:54.591840 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:54.591764 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hxvwz\""
Apr 17 14:23:54.599920 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:54.599896 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tg9jd"
Apr 17 14:23:54.715451 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:54.715419 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tg9jd"]
Apr 17 14:23:54.718702 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:54.718668 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c68694_ceb3_44f8_a9e8_e0655e8aa848.slice/crio-0089d88de7841aa658fb3e603d20a076b3737debb3abbb4333419e8a377127c4 WatchSource:0}: Error finding container 0089d88de7841aa658fb3e603d20a076b3737debb3abbb4333419e8a377127c4: Status 404 returned error can't find the container with id 0089d88de7841aa658fb3e603d20a076b3737debb3abbb4333419e8a377127c4
Apr 17 14:23:55.108433 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:55.108398 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tg9jd" event={"ID":"41c68694-ceb3-44f8-a9e8-e0655e8aa848","Type":"ContainerStarted","Data":"0089d88de7841aa658fb3e603d20a076b3737debb3abbb4333419e8a377127c4"}
Apr 17 14:23:55.109669 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:55.109640 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz" event={"ID":"3dd4d766-4849-4e53-83b5-a6a7f75bed97","Type":"ContainerStarted","Data":"ee4ddf741b30b4a4734da69c5cf765fb7deeb12584cea648c210e603d45acdac"}
Apr 17 14:23:55.109887 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:55.109869 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:55.114283 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:55.114265 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz"
Apr 17 14:23:55.124646 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:55.124597 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c7rdz" podStartSLOduration=1.782327234 podStartE2EDuration="3.12458598s" podCreationTimestamp="2026-04-17 14:23:52 +0000 UTC" firstStartedPulling="2026-04-17 14:23:53.094803216 +0000 UTC m=+128.129009091" lastFinishedPulling="2026-04-17 14:23:54.437061955 +0000 UTC m=+129.471267837" observedRunningTime="2026-04-17 14:23:55.123623701 +0000 UTC m=+130.157829598" watchObservedRunningTime="2026-04-17 14:23:55.12458598 +0000 UTC m=+130.158791876"
Apr 17 14:23:56.110936 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.110901 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fn9g6"]
Apr 17 14:23:56.113825 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.113794 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6"
Apr 17 14:23:56.114570 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.114528 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tg9jd" event={"ID":"41c68694-ceb3-44f8-a9e8-e0655e8aa848","Type":"ContainerStarted","Data":"d493fc900e0fd43b48c831ec20a17b51091ff628a4fd8478670266aa34d39cc3"}
Apr 17 14:23:56.116453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.116429 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 14:23:56.117994 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.117711 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 14:23:56.117994 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.117743 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 14:23:56.117994 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.117819 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 14:23:56.117994 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.117747 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gjxdl\""
Apr 17 14:23:56.117994 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.117989 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 14:23:56.121475 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.121449 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fn9g6"]
Apr 17 14:23:56.210241 ip-10-0-143-215
kubenswrapper[2568]: I0417 14:23:56.210213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.210241 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.210250 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqzjn\" (UniqueName: \"kubernetes.io/projected/f1cfbda9-213d-401f-85eb-8006efac438b-kube-api-access-pqzjn\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.210406 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.210283 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.210457 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.210437 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1cfbda9-213d-401f-85eb-8006efac438b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.311146 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.311049 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1cfbda9-213d-401f-85eb-8006efac438b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.311146 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.311122 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.311146 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.311141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqzjn\" (UniqueName: \"kubernetes.io/projected/f1cfbda9-213d-401f-85eb-8006efac438b-kube-api-access-pqzjn\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.311398 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.311186 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.311398 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:23:56.311289 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 14:23:56.311398 ip-10-0-143-215 kubenswrapper[2568]: E0417 
14:23:56.311353 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-tls podName:f1cfbda9-213d-401f-85eb-8006efac438b nodeName:}" failed. No retries permitted until 2026-04-17 14:23:56.811334428 +0000 UTC m=+131.845540304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-fn9g6" (UID: "f1cfbda9-213d-401f-85eb-8006efac438b") : secret "prometheus-operator-tls" not found Apr 17 14:23:56.311745 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.311722 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1cfbda9-213d-401f-85eb-8006efac438b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.313643 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.313621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.319774 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.319747 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqzjn\" (UniqueName: \"kubernetes.io/projected/f1cfbda9-213d-401f-85eb-8006efac438b-kube-api-access-pqzjn\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 
17 14:23:56.813470 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.813420 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:56.815833 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:56.815814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1cfbda9-213d-401f-85eb-8006efac438b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fn9g6\" (UID: \"f1cfbda9-213d-401f-85eb-8006efac438b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:57.027297 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:57.027263 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" Apr 17 14:23:57.118729 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:57.118693 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tg9jd" event={"ID":"41c68694-ceb3-44f8-a9e8-e0655e8aa848","Type":"ContainerStarted","Data":"b19287db464841ab80829528a720e00c270cffb884cbfc1299f5f2e64ce08211"} Apr 17 14:23:57.135923 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:57.135878 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tg9jd" podStartSLOduration=130.867029691 podStartE2EDuration="2m12.135864927s" podCreationTimestamp="2026-04-17 14:21:45 +0000 UTC" firstStartedPulling="2026-04-17 14:23:54.720940005 +0000 UTC m=+129.755145880" lastFinishedPulling="2026-04-17 14:23:55.989775236 +0000 UTC m=+131.023981116" observedRunningTime="2026-04-17 14:23:57.134732009 +0000 UTC m=+132.168937907" watchObservedRunningTime="2026-04-17 14:23:57.135864927 +0000 UTC m=+132.170070825" Apr 17 14:23:57.139587 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:57.139563 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fn9g6"] Apr 17 14:23:57.142154 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:23:57.142128 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1cfbda9_213d_401f_85eb_8006efac438b.slice/crio-1574f30adb92a61af38ec8a278be86899e6800b0dc2a388ce25803ea923e5db0 WatchSource:0}: Error finding container 1574f30adb92a61af38ec8a278be86899e6800b0dc2a388ce25803ea923e5db0: Status 404 returned error can't find the container with id 1574f30adb92a61af38ec8a278be86899e6800b0dc2a388ce25803ea923e5db0 Apr 17 14:23:58.122900 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:58.122859 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" event={"ID":"f1cfbda9-213d-401f-85eb-8006efac438b","Type":"ContainerStarted","Data":"1574f30adb92a61af38ec8a278be86899e6800b0dc2a388ce25803ea923e5db0"} Apr 17 14:23:59.127803 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:59.127768 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" event={"ID":"f1cfbda9-213d-401f-85eb-8006efac438b","Type":"ContainerStarted","Data":"e08e04fa96234e6ea618d64d29ce2909b57fe7edc48a4c9695c6ce0aa182f8a3"} Apr 17 14:23:59.127803 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:59.127805 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" event={"ID":"f1cfbda9-213d-401f-85eb-8006efac438b","Type":"ContainerStarted","Data":"fa7bb32b5c3c231b184d40405e679fd585597825f4970c8d572298a937b24724"} Apr 17 14:23:59.144778 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:23:59.144722 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-fn9g6" podStartSLOduration=2.058874933 podStartE2EDuration="3.14470935s" podCreationTimestamp="2026-04-17 14:23:56 +0000 UTC" firstStartedPulling="2026-04-17 14:23:57.148240465 +0000 UTC m=+132.182446343" lastFinishedPulling="2026-04-17 14:23:58.234074881 +0000 UTC m=+133.268280760" observedRunningTime="2026-04-17 14:23:59.143144137 +0000 UTC m=+134.177350034" watchObservedRunningTime="2026-04-17 14:23:59.14470935 +0000 UTC m=+134.178915247" Apr 17 14:24:01.440030 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.439993 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rcpxg"] Apr 17 14:24:01.442430 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.442413 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.446688 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.446661 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 14:24:01.446830 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.446786 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-xpx5b\"" Apr 17 14:24:01.446830 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.446799 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 14:24:01.447831 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.447812 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 14:24:01.470692 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.470659 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rcpxg"] Apr 17 14:24:01.484148 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.484114 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pjgwp"] Apr 17 14:24:01.486287 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.486266 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.489408 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.489385 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 14:24:01.489543 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.489388 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-c4k9f\"" Apr 17 14:24:01.489543 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.489455 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 14:24:01.489543 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.489392 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 14:24:01.551995 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.551957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45c30a2a-d629-471c-b0f0-30401c8f5083-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.551995 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.551996 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/45c30a2a-d629-471c-b0f0-30401c8f5083-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.552276 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.552021 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.552276 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.552043 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.552276 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.552115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.552276 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.552142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92v9\" (UniqueName: \"kubernetes.io/projected/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-api-access-x92v9\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.652932 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.652889 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-root\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.652932 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.652933 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653207 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.652969 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45c30a2a-d629-471c-b0f0-30401c8f5083-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653207 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/45c30a2a-d629-471c-b0f0-30401c8f5083-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653207 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653030 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653207 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653151 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653263 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-textfile\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " 
pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-tls\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653391 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-metrics-client-ca\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x92v9\" (UniqueName: \"kubernetes.io/projected/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-api-access-x92v9\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsrn\" (UniqueName: \"kubernetes.io/projected/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-kube-api-access-jjsrn\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653501 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-sys\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653525 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-wtmp\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.653675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/45c30a2a-d629-471c-b0f0-30401c8f5083-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653918 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45c30a2a-d629-471c-b0f0-30401c8f5083-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.653918 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.653869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 
17 14:24:01.655681 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.655652 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.655681 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.655662 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.661724 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.661704 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92v9\" (UniqueName: \"kubernetes.io/projected/45c30a2a-d629-471c-b0f0-30401c8f5083-kube-api-access-x92v9\") pod \"kube-state-metrics-69db897b98-rcpxg\" (UID: \"45c30a2a-d629-471c-b0f0-30401c8f5083\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.751314 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.751272 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" Apr 17 14:24:01.754272 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754248 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-textfile\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754360 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754360 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-tls\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754360 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-metrics-client-ca\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754504 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754362 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsrn\" (UniqueName: 
\"kubernetes.io/projected/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-kube-api-access-jjsrn\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754504 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754389 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-sys\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754504 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-wtmp\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754504 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754438 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-root\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754504 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754489 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-root\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754724 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754499 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-sys\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754724 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754724 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754583 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-textfile\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.754724 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.754590 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-wtmp\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.755895 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.755873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-accelerators-collector-config\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.756013 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.755875 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-metrics-client-ca\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.757060 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.757031 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-tls\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.757158 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.757114 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.762362 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.762339 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsrn\" (UniqueName: \"kubernetes.io/projected/c22e4e5b-cdfb-4f36-892c-be821cb5bb18-kube-api-access-jjsrn\") pod \"node-exporter-pjgwp\" (UID: \"c22e4e5b-cdfb-4f36-892c-be821cb5bb18\") " pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.810954 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.803336 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pjgwp" Apr 17 14:24:01.907778 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:01.907743 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rcpxg"] Apr 17 14:24:01.911254 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:24:01.911213 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c30a2a_d629_471c_b0f0_30401c8f5083.slice/crio-972f1424f0dcd2e16ebd0279b9441dac7cc66bd19150d408c17b8c0e08259d78 WatchSource:0}: Error finding container 972f1424f0dcd2e16ebd0279b9441dac7cc66bd19150d408c17b8c0e08259d78: Status 404 returned error can't find the container with id 972f1424f0dcd2e16ebd0279b9441dac7cc66bd19150d408c17b8c0e08259d78 Apr 17 14:24:02.137083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:02.136996 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjgwp" event={"ID":"c22e4e5b-cdfb-4f36-892c-be821cb5bb18","Type":"ContainerStarted","Data":"8bae8a42a0a953aa76a655f9c4a966a696f441da86feefb240a9d893b3da4458"} Apr 17 14:24:02.137966 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:02.137940 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" event={"ID":"45c30a2a-d629-471c-b0f0-30401c8f5083","Type":"ContainerStarted","Data":"972f1424f0dcd2e16ebd0279b9441dac7cc66bd19150d408c17b8c0e08259d78"} Apr 17 14:24:03.142433 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:03.142400 2568 generic.go:358] "Generic (PLEG): container finished" podID="c22e4e5b-cdfb-4f36-892c-be821cb5bb18" containerID="0041ce0c6d2bb7d2dd4b107770b0633d3d956f92f1e8cc08ebb74add59eece8a" exitCode=0 Apr 17 14:24:03.142798 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:03.142457 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjgwp" 
event={"ID":"c22e4e5b-cdfb-4f36-892c-be821cb5bb18","Type":"ContainerDied","Data":"0041ce0c6d2bb7d2dd4b107770b0633d3d956f92f1e8cc08ebb74add59eece8a"} Apr 17 14:24:03.579908 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:03.579868 2568 scope.go:117] "RemoveContainer" containerID="cbf32ba2fb02ae2d6ca4906d0203d2b5a15c6ab144157b43ce63a1baeff06fbe" Apr 17 14:24:04.147182 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.147136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjgwp" event={"ID":"c22e4e5b-cdfb-4f36-892c-be821cb5bb18","Type":"ContainerStarted","Data":"0521daa5755c32e9130ad3b0e967bde11e10f2d81f298c404d7fb9b5864e5618"} Apr 17 14:24:04.147646 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.147192 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pjgwp" event={"ID":"c22e4e5b-cdfb-4f36-892c-be821cb5bb18","Type":"ContainerStarted","Data":"2db104c797fc0c73b51e3ae3f86c87cbf702affb09c6bf910fb589ad480b50d0"} Apr 17 14:24:04.148693 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.148676 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:24:04.148863 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.148747 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" event={"ID":"93531d07-7bae-4782-818d-d6e8ceecf396","Type":"ContainerStarted","Data":"37fa6d0c42508afe1476b1367d0252795b752626bc0a5e91fa6c8806e9b59f5c"} Apr 17 14:24:04.149046 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.149013 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:24:04.150706 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.150688 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" event={"ID":"45c30a2a-d629-471c-b0f0-30401c8f5083","Type":"ContainerStarted","Data":"fb242477645b06605212a1a629b4218d0bba9c69be89f10f02d35762aad11bdf"} Apr 17 14:24:04.150791 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.150711 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" event={"ID":"45c30a2a-d629-471c-b0f0-30401c8f5083","Type":"ContainerStarted","Data":"5fff1adec9566312b5d540bd839b832a8c2c880f18349ed16baa5222062cd91f"} Apr 17 14:24:04.150791 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.150721 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" event={"ID":"45c30a2a-d629-471c-b0f0-30401c8f5083","Type":"ContainerStarted","Data":"4dc9cedc53d73afdd775cf2736d1cf31b060ac44f8e45d5cf112d1aee5974909"} Apr 17 14:24:04.166223 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.166156 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pjgwp" podStartSLOduration=2.465633112 podStartE2EDuration="3.166143575s" podCreationTimestamp="2026-04-17 14:24:01 +0000 UTC" firstStartedPulling="2026-04-17 14:24:01.821818452 +0000 UTC m=+136.856024327" lastFinishedPulling="2026-04-17 14:24:02.522328912 +0000 UTC m=+137.556534790" observedRunningTime="2026-04-17 14:24:04.164653523 +0000 UTC m=+139.198859421" watchObservedRunningTime="2026-04-17 14:24:04.166143575 +0000 UTC m=+139.200349466" Apr 17 14:24:04.183801 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.183753 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" podStartSLOduration=44.45685669 podStartE2EDuration="47.183740202s" podCreationTimestamp="2026-04-17 14:23:17 +0000 UTC" firstStartedPulling="2026-04-17 14:23:17.87879523 +0000 UTC m=+92.913001108" 
lastFinishedPulling="2026-04-17 14:23:20.605678742 +0000 UTC m=+95.639884620" observedRunningTime="2026-04-17 14:24:04.182932488 +0000 UTC m=+139.217138386" watchObservedRunningTime="2026-04-17 14:24:04.183740202 +0000 UTC m=+139.217946099" Apr 17 14:24:04.198836 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.198790 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-rcpxg" podStartSLOduration=1.955768522 podStartE2EDuration="3.198777872s" podCreationTimestamp="2026-04-17 14:24:01 +0000 UTC" firstStartedPulling="2026-04-17 14:24:01.913135977 +0000 UTC m=+136.947341852" lastFinishedPulling="2026-04-17 14:24:03.156145314 +0000 UTC m=+138.190351202" observedRunningTime="2026-04-17 14:24:04.197839513 +0000 UTC m=+139.232045423" watchObservedRunningTime="2026-04-17 14:24:04.198777872 +0000 UTC m=+139.232983769" Apr 17 14:24:04.296214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.296160 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ln5j2" Apr 17 14:24:04.472008 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.471974 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-dwmhd"] Apr 17 14:24:04.474497 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.474461 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dwmhd" Apr 17 14:24:04.477151 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.477129 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 14:24:04.477284 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.477185 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 14:24:04.477284 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.477197 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-54kt6\"" Apr 17 14:24:04.485848 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.485822 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dwmhd"] Apr 17 14:24:04.578069 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.578036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc666\" (UniqueName: \"kubernetes.io/projected/5d101190-d888-4039-937e-bdefcee0eb15-kube-api-access-vc666\") pod \"downloads-6bcc868b7-dwmhd\" (UID: \"5d101190-d888-4039-937e-bdefcee0eb15\") " pod="openshift-console/downloads-6bcc868b7-dwmhd" Apr 17 14:24:04.678826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.678781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vc666\" (UniqueName: \"kubernetes.io/projected/5d101190-d888-4039-937e-bdefcee0eb15-kube-api-access-vc666\") pod \"downloads-6bcc868b7-dwmhd\" (UID: \"5d101190-d888-4039-937e-bdefcee0eb15\") " pod="openshift-console/downloads-6bcc868b7-dwmhd" Apr 17 14:24:04.686746 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.686721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc666\" (UniqueName: 
\"kubernetes.io/projected/5d101190-d888-4039-937e-bdefcee0eb15-kube-api-access-vc666\") pod \"downloads-6bcc868b7-dwmhd\" (UID: \"5d101190-d888-4039-937e-bdefcee0eb15\") " pod="openshift-console/downloads-6bcc868b7-dwmhd" Apr 17 14:24:04.784540 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.784437 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dwmhd" Apr 17 14:24:04.903973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:04.903912 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dwmhd"] Apr 17 14:24:04.906762 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:24:04.906728 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d101190_d888_4039_937e_bdefcee0eb15.slice/crio-56281708f021118cd9a1c600a5f752f95efaa4d20c82e8b9ac919c355353d631 WatchSource:0}: Error finding container 56281708f021118cd9a1c600a5f752f95efaa4d20c82e8b9ac919c355353d631: Status 404 returned error can't find the container with id 56281708f021118cd9a1c600a5f752f95efaa4d20c82e8b9ac919c355353d631 Apr 17 14:24:05.154906 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:05.154811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dwmhd" event={"ID":"5d101190-d888-4039-937e-bdefcee0eb15","Type":"ContainerStarted","Data":"56281708f021118cd9a1c600a5f752f95efaa4d20c82e8b9ac919c355353d631"} Apr 17 14:24:06.220230 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:06.220197 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j"] Apr 17 14:24:06.229706 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:06.229091 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:06.231070 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:06.231040 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j"] Apr 17 14:24:06.231809 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:06.231669 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 14:24:06.231809 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:06.231680 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gr795\"" Apr 17 14:24:06.394364 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:06.394323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/aebdc27c-3d37-4850-8498-6e2aa14e37c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r7x4j\" (UID: \"aebdc27c-3d37-4850-8498-6e2aa14e37c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:06.495376 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:06.495288 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/aebdc27c-3d37-4850-8498-6e2aa14e37c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r7x4j\" (UID: \"aebdc27c-3d37-4850-8498-6e2aa14e37c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:06.495610 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:24:06.495459 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 14:24:06.495737 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:24:06.495721 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/aebdc27c-3d37-4850-8498-6e2aa14e37c6-monitoring-plugin-cert podName:aebdc27c-3d37-4850-8498-6e2aa14e37c6 nodeName:}" failed. No retries permitted until 2026-04-17 14:24:06.995697607 +0000 UTC m=+142.029903482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/aebdc27c-3d37-4850-8498-6e2aa14e37c6-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-r7x4j" (UID: "aebdc27c-3d37-4850-8498-6e2aa14e37c6") : secret "monitoring-plugin-cert" not found Apr 17 14:24:07.000150 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.000110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/aebdc27c-3d37-4850-8498-6e2aa14e37c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r7x4j\" (UID: \"aebdc27c-3d37-4850-8498-6e2aa14e37c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:07.002523 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.002490 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/aebdc27c-3d37-4850-8498-6e2aa14e37c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r7x4j\" (UID: \"aebdc27c-3d37-4850-8498-6e2aa14e37c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:07.141524 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.141489 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:07.263953 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.263878 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j"] Apr 17 14:24:07.268426 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:24:07.268390 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaebdc27c_3d37_4850_8498_6e2aa14e37c6.slice/crio-ee83f511f3983ef62bcb882f9c1336fb117d8f16dcfe18e851fad419490eaa1b WatchSource:0}: Error finding container ee83f511f3983ef62bcb882f9c1336fb117d8f16dcfe18e851fad419490eaa1b: Status 404 returned error can't find the container with id ee83f511f3983ef62bcb882f9c1336fb117d8f16dcfe18e851fad419490eaa1b Apr 17 14:24:07.609970 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.609723 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:24:07.614700 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.614675 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.617904 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.617260 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 14:24:07.617904 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.617483 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 14:24:07.617904 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.617508 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-mrj6khmfdqv8\"" Apr 17 14:24:07.617904 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.617642 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 14:24:07.617904 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.617751 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8d6tk\"" Apr 17 14:24:07.617904 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.617761 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 14:24:07.618707 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.618616 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 14:24:07.620649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.618922 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 14:24:07.620649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.619117 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 14:24:07.620649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.619388 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 14:24:07.620649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.619622 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 14:24:07.620649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.619805 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 14:24:07.620649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.620022 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 14:24:07.620649 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.620581 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 14:24:07.623185 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.622969 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 14:24:07.631546 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.631502 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:24:07.706286 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-config\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706471 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706317 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706471 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706471 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706377 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706471 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706469 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-kube-rbac-proxy-web\") 
pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706499 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-config-out\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706523 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-web-config\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706640 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706664 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.706686 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706688 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.707105 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57wq5\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-kube-api-access-57wq5\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.707105 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.707105 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.707105 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.706861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808042 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.807949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-config\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808042 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808019 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808135 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808175 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-config-out\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-web-config\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808318 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808374 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808398 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57wq5\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-kube-api-access-57wq5\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.808837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.808515 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.810505 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.809392 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.810505 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.810393 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.811077 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.810967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.813903 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.813851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.814209 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.814185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.814621 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.814599 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.815677 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.815627 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.816568 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.816462 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.818074 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.818025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-config-out\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.818253 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.818208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-web-config\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.818470 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.818451 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.818536 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.818466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.818588 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.818565 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.819219 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.819107 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.819219 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.819145 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.819219 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.819183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-config\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.819444 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.819422 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.821435 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.821389 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-57wq5\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-kube-api-access-57wq5\") pod \"prometheus-k8s-0\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:07.934595 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:07.933986 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:08.108340 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:08.108271 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:24:08.113047 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:24:08.113015 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd469e034_4c97_4729_90fb_5a3448054898.slice/crio-548494d56733a8a1ad184b1aa19e38bc7772a0616d3676df8f8ad31cdfc9431b WatchSource:0}: Error finding container 548494d56733a8a1ad184b1aa19e38bc7772a0616d3676df8f8ad31cdfc9431b: Status 404 returned error can't find the container with id 548494d56733a8a1ad184b1aa19e38bc7772a0616d3676df8f8ad31cdfc9431b Apr 17 14:24:08.168928 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:08.168871 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerStarted","Data":"548494d56733a8a1ad184b1aa19e38bc7772a0616d3676df8f8ad31cdfc9431b"} Apr 17 14:24:08.170224 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:08.170194 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" event={"ID":"aebdc27c-3d37-4850-8498-6e2aa14e37c6","Type":"ContainerStarted","Data":"ee83f511f3983ef62bcb882f9c1336fb117d8f16dcfe18e851fad419490eaa1b"} Apr 17 14:24:09.174952 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:09.174860 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" event={"ID":"aebdc27c-3d37-4850-8498-6e2aa14e37c6","Type":"ContainerStarted","Data":"ca7f680f78af5b11f49492cb6b84b87149d82e194ce6bfe18fc3a648fc1fc162"} Apr 17 14:24:09.175417 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:09.175107 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:09.181917 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:09.181885 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" Apr 17 14:24:09.190745 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:09.190681 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r7x4j" podStartSLOduration=1.670697847 podStartE2EDuration="3.190662265s" podCreationTimestamp="2026-04-17 14:24:06 +0000 UTC" firstStartedPulling="2026-04-17 14:24:07.270190988 +0000 UTC m=+142.304396863" lastFinishedPulling="2026-04-17 14:24:08.790155393 +0000 UTC m=+143.824361281" observedRunningTime="2026-04-17 14:24:09.189786267 +0000 UTC m=+144.223992164" watchObservedRunningTime="2026-04-17 14:24:09.190662265 +0000 UTC m=+144.224868164" Apr 17 14:24:10.179211 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:10.179145 2568 generic.go:358] "Generic (PLEG): container finished" podID="d469e034-4c97-4729-90fb-5a3448054898" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" exitCode=0 Apr 17 14:24:10.179665 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:10.179243 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"} Apr 17 14:24:14.195541 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:14.195488 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerStarted","Data":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"} Apr 17 14:24:14.195541 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:14.195537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerStarted","Data":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"} Apr 17 14:24:19.840738 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:24:19.840687 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gzm7s" podUID="a9c4445f-88cc-4c46-800e-db32500ad34d" Apr 17 14:24:19.855892 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:24:19.855843 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4nk2q" podUID="0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7" Apr 17 14:24:20.215934 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:20.215902 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:24:20.216123 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:20.215903 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gzm7s" Apr 17 14:24:23.230962 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.230923 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dwmhd" event={"ID":"5d101190-d888-4039-937e-bdefcee0eb15","Type":"ContainerStarted","Data":"56000e0074d312c68ff7441de20d096865150fa5cc47f14a2bc15d08542786bc"} Apr 17 14:24:23.231469 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.231098 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-dwmhd" Apr 17 14:24:23.235158 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.235134 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerStarted","Data":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"} Apr 17 14:24:23.235298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.235186 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerStarted","Data":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"} Apr 17 14:24:23.235298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.235204 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerStarted","Data":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"} Apr 17 14:24:23.235298 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.235217 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerStarted","Data":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"} Apr 17 14:24:23.247707 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:24:23.247683 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-dwmhd" Apr 17 14:24:23.255012 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.254958 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-dwmhd" podStartSLOduration=1.976976837 podStartE2EDuration="19.254942346s" podCreationTimestamp="2026-04-17 14:24:04 +0000 UTC" firstStartedPulling="2026-04-17 14:24:04.908997777 +0000 UTC m=+139.943203652" lastFinishedPulling="2026-04-17 14:24:22.186963286 +0000 UTC m=+157.221169161" observedRunningTime="2026-04-17 14:24:23.253296594 +0000 UTC m=+158.287502492" watchObservedRunningTime="2026-04-17 14:24:23.254942346 +0000 UTC m=+158.289148244" Apr 17 14:24:23.300281 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:23.300219 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.272267347 podStartE2EDuration="16.300200486s" podCreationTimestamp="2026-04-17 14:24:07 +0000 UTC" firstStartedPulling="2026-04-17 14:24:08.118416921 +0000 UTC m=+143.152622811" lastFinishedPulling="2026-04-17 14:24:22.146350062 +0000 UTC m=+157.180555950" observedRunningTime="2026-04-17 14:24:23.298959387 +0000 UTC m=+158.333165285" watchObservedRunningTime="2026-04-17 14:24:23.300200486 +0000 UTC m=+158.334406385" Apr 17 14:24:24.664033 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.663941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:24:24.664494 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.664144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s" Apr 17 14:24:24.666640 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.666617 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9c4445f-88cc-4c46-800e-db32500ad34d-metrics-tls\") pod \"dns-default-gzm7s\" (UID: \"a9c4445f-88cc-4c46-800e-db32500ad34d\") " pod="openshift-dns/dns-default-gzm7s" Apr 17 14:24:24.666756 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.666736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7-cert\") pod \"ingress-canary-4nk2q\" (UID: \"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7\") " pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:24:24.719328 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.719289 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rkhth\"" Apr 17 14:24:24.720316 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.720289 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w9wr2\"" Apr 17 14:24:24.727596 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.727559 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gzm7s" Apr 17 14:24:24.727737 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.727567 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nk2q" Apr 17 14:24:24.882847 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.882812 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gzm7s"] Apr 17 14:24:24.886290 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:24:24.886255 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c4445f_88cc_4c46_800e_db32500ad34d.slice/crio-7de09d3fb40cc75eb3685d53cfcd265055d7df14739e6d6e117941e79579e3b0 WatchSource:0}: Error finding container 7de09d3fb40cc75eb3685d53cfcd265055d7df14739e6d6e117941e79579e3b0: Status 404 returned error can't find the container with id 7de09d3fb40cc75eb3685d53cfcd265055d7df14739e6d6e117941e79579e3b0 Apr 17 14:24:24.899839 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:24.899815 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4nk2q"] Apr 17 14:24:24.902740 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:24:24.902716 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c1ca4b0_b6b8_4bf7_8b62_7756a8d140e7.slice/crio-9adb988519b3cf263d294de6c7a7756bbcadff68585d43ed6c8788039b0922cd WatchSource:0}: Error finding container 9adb988519b3cf263d294de6c7a7756bbcadff68585d43ed6c8788039b0922cd: Status 404 returned error can't find the container with id 9adb988519b3cf263d294de6c7a7756bbcadff68585d43ed6c8788039b0922cd Apr 17 14:24:25.243815 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:25.243778 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gzm7s" event={"ID":"a9c4445f-88cc-4c46-800e-db32500ad34d","Type":"ContainerStarted","Data":"7de09d3fb40cc75eb3685d53cfcd265055d7df14739e6d6e117941e79579e3b0"} Apr 17 14:24:25.245153 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:25.245115 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4nk2q" event={"ID":"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7","Type":"ContainerStarted","Data":"9adb988519b3cf263d294de6c7a7756bbcadff68585d43ed6c8788039b0922cd"} Apr 17 14:24:27.934145 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:27.934108 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:24:28.259867 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:28.259772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gzm7s" event={"ID":"a9c4445f-88cc-4c46-800e-db32500ad34d","Type":"ContainerStarted","Data":"877a1c0a006115456a9159d99f75d6d3666bc010cdfce275d97d1808608c8854"} Apr 17 14:24:28.259867 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:28.259815 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gzm7s" event={"ID":"a9c4445f-88cc-4c46-800e-db32500ad34d","Type":"ContainerStarted","Data":"bc009a1760514c76dce98b2fb67d3979176b00921183bdb83cb20936d743fc30"} Apr 17 14:24:28.260063 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:28.259888 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gzm7s" Apr 17 14:24:28.261396 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:28.261365 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4nk2q" event={"ID":"0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7","Type":"ContainerStarted","Data":"6aace06989494dde65d1b8c3e130ee8931119b7e19a95191eadbb27afd2a1292"} Apr 17 14:24:28.277608 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:28.277559 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gzm7s" podStartSLOduration=129.416898708 podStartE2EDuration="2m12.277543902s" podCreationTimestamp="2026-04-17 14:22:16 +0000 UTC" firstStartedPulling="2026-04-17 14:24:24.888646228 +0000 
UTC m=+159.922852102" lastFinishedPulling="2026-04-17 14:24:27.749291406 +0000 UTC m=+162.783497296" observedRunningTime="2026-04-17 14:24:28.275492058 +0000 UTC m=+163.309697970" watchObservedRunningTime="2026-04-17 14:24:28.277543902 +0000 UTC m=+163.311749796" Apr 17 14:24:28.291475 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:28.291416 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4nk2q" podStartSLOduration=129.441142892 podStartE2EDuration="2m12.291385259s" podCreationTimestamp="2026-04-17 14:22:16 +0000 UTC" firstStartedPulling="2026-04-17 14:24:24.904742126 +0000 UTC m=+159.938948015" lastFinishedPulling="2026-04-17 14:24:27.754984508 +0000 UTC m=+162.789190382" observedRunningTime="2026-04-17 14:24:28.289996493 +0000 UTC m=+163.324202414" watchObservedRunningTime="2026-04-17 14:24:28.291385259 +0000 UTC m=+163.325591166" Apr 17 14:24:38.270915 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:38.270878 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gzm7s" Apr 17 14:24:43.157734 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:43.157705 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gzm7s_a9c4445f-88cc-4c46-800e-db32500ad34d/dns/0.log" Apr 17 14:24:43.166210 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:43.166183 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gzm7s_a9c4445f-88cc-4c46-800e-db32500ad34d/kube-rbac-proxy/0.log" Apr 17 14:24:43.337679 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:43.337652 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6f2x6_7f633d5b-7896-43f3-b506-dc236c755507/dns-node-resolver/0.log" Apr 17 14:24:53.342457 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:53.342425 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="b9528958-b786-4c25-8d67-30d1493f6002" containerID="f17c4574c4d9b1b605a5b5394d6edecd57153af198e493f0bed18a000d8d62da" exitCode=0
Apr 17 14:24:53.342851 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:53.342507 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" event={"ID":"b9528958-b786-4c25-8d67-30d1493f6002","Type":"ContainerDied","Data":"f17c4574c4d9b1b605a5b5394d6edecd57153af198e493f0bed18a000d8d62da"}
Apr 17 14:24:53.342935 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:53.342919 2568 scope.go:117] "RemoveContainer" containerID="f17c4574c4d9b1b605a5b5394d6edecd57153af198e493f0bed18a000d8d62da"
Apr 17 14:24:54.347799 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:24:54.347762 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9p4v4" event={"ID":"b9528958-b786-4c25-8d67-30d1493f6002","Type":"ContainerStarted","Data":"08ac34bb2749233ec1d988e0a313a74705552408009a6ff46e0d3d025da129f1"}
Apr 17 14:25:07.934545 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:07.934504 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:25:07.997042 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:07.997014 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:25:08.405987 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:08.405961 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:25:25.926540 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:25.926499 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:25:25.927184 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:25.927111 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="prometheus" containerID="cri-o://72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80" gracePeriod=600
Apr 17 14:25:25.927184 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:25.927133 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy" containerID="cri-o://bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7" gracePeriod=600
Apr 17 14:25:25.927334 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:25.927212 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="config-reloader" containerID="cri-o://997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d" gracePeriod=600
Apr 17 14:25:25.927334 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:25.927181 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-thanos" containerID="cri-o://8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b" gracePeriod=600
Apr 17 14:25:25.927427 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:25.927136 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="thanos-sidecar" containerID="cri-o://7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6" gracePeriod=600
Apr 17 14:25:25.927427 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:25.927140 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-web" containerID="cri-o://9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a" gracePeriod=600
Apr 17 14:25:26.181675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.181611 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:25:26.289376 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289342 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-trusted-ca-bundle\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289376 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289380 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-serving-certs-ca-bundle\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289570 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289418 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289627 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289604 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-metrics-client-certs\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289675 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289661 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-grpc-tls\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289722 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289698 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289777 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289724 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-config\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289777 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289743 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57wq5\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-kube-api-access-57wq5\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289878 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289781 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:25:26.289878 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289795 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-web-config\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.289878 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289801 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:25:26.289878 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289856 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-db\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289886 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-tls-assets\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289917 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-thanos-prometheus-http-client-file\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289957 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-tls\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.289988 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-config-out\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.290015 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-metrics-client-ca\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.290040 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-rulefiles-0\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290083 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.290077 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-kube-rbac-proxy\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290448 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.290102 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-kubelet-serving-ca-bundle\") pod \"d469e034-4c97-4729-90fb-5a3448054898\" (UID: \"d469e034-4c97-4729-90fb-5a3448054898\") "
Apr 17 14:25:26.290448 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.290377 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.290448 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.290399 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.292426 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.290858 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:25:26.292426 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.292005 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:25:26.292426 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.292372 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-kube-api-access-57wq5" (OuterVolumeSpecName: "kube-api-access-57wq5") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "kube-api-access-57wq5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:25:26.292837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.292810 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:25:26.293132 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.293095 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-config" (OuterVolumeSpecName: "config") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.293132 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.293116 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.293388 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.293347 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.293512 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.293406 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.293717 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.293645 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:25:26.293717 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.293685 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.294049 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.294024 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.294406 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.294386 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.294499 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.294475 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:25:26.294829 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.294811 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.295096 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.295073 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-config-out" (OuterVolumeSpecName: "config-out") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:25:26.303576 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.303557 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-web-config" (OuterVolumeSpecName: "web-config") pod "d469e034-4c97-4729-90fb-5a3448054898" (UID: "d469e034-4c97-4729-90fb-5a3448054898"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:25:26.390692 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390662 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-kube-rbac-proxy\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390692 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390691 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390702 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390712 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-metrics-client-certs\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390722 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-grpc-tls\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390731 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390740 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-config\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390750 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57wq5\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-kube-api-access-57wq5\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390759 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-web-config\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390767 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-db\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390775 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d469e034-4c97-4729-90fb-5a3448054898-tls-assets\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390784 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390794 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d469e034-4c97-4729-90fb-5a3448054898-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390803 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d469e034-4c97-4729-90fb-5a3448054898-config-out\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390812 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-configmap-metrics-client-ca\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.390866 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.390820 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d469e034-4c97-4729-90fb-5a3448054898-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\""
Apr 17 14:25:26.445525 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445444 2568 generic.go:358] "Generic (PLEG): container finished" podID="d469e034-4c97-4729-90fb-5a3448054898" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b" exitCode=0
Apr 17 14:25:26.445525 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445468 2568 generic.go:358] "Generic (PLEG): container finished" podID="d469e034-4c97-4729-90fb-5a3448054898" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7" exitCode=0
Apr 17 14:25:26.445525 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445475 2568 generic.go:358] "Generic (PLEG): container finished" podID="d469e034-4c97-4729-90fb-5a3448054898" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a" exitCode=0
Apr 17 14:25:26.445525 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445481 2568 generic.go:358] "Generic (PLEG): container finished" podID="d469e034-4c97-4729-90fb-5a3448054898" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6" exitCode=0
Apr 17 14:25:26.445525 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445486 2568 generic.go:358] "Generic (PLEG): container finished" podID="d469e034-4c97-4729-90fb-5a3448054898" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d" exitCode=0
Apr 17 14:25:26.445525 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445491 2568 generic.go:358] "Generic (PLEG): container finished" podID="d469e034-4c97-4729-90fb-5a3448054898" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80" exitCode=0
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445533 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"}
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445558 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445573 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"}
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445584 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"}
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445595 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"}
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445604 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"}
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445613 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"}
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d469e034-4c97-4729-90fb-5a3448054898","Type":"ContainerDied","Data":"548494d56733a8a1ad184b1aa19e38bc7772a0616d3676df8f8ad31cdfc9431b"}
Apr 17 14:25:26.445784 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.445639 2568 scope.go:117] "RemoveContainer" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"
Apr 17 14:25:26.453404 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.453270 2568 scope.go:117] "RemoveContainer" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"
Apr 17 14:25:26.460626 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.460612 2568 scope.go:117] "RemoveContainer" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"
Apr 17 14:25:26.466811 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.466795 2568 scope.go:117] "RemoveContainer" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"
Apr 17 14:25:26.471450 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.471426 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:25:26.475221 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.475200 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:25:26.475919 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.475896 2568 scope.go:117] "RemoveContainer" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"
Apr 17 14:25:26.483151 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.483137 2568 scope.go:117] "RemoveContainer" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"
Apr 17 14:25:26.489702 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.489685 2568 scope.go:117] "RemoveContainer" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"
Apr 17 14:25:26.495895 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.495795 2568 scope.go:117] "RemoveContainer" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"
Apr 17 14:25:26.496109 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:25:26.496083 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": container with ID starting with 8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b not found: ID does not exist" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"
Apr 17 14:25:26.496214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.496122 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"} err="failed to get container status \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": rpc error: code = NotFound desc = could not find container \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": container with ID starting with 8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b not found: ID does not exist"
Apr 17 14:25:26.496214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.496191 2568 scope.go:117] "RemoveContainer" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"
Apr 17 14:25:26.496650 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:25:26.496528 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": container with ID starting with bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7 not found: ID does not exist" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"
Apr 17 14:25:26.496650 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.496558 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"} err="failed to get container status \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": rpc error: code = NotFound desc = could not find container \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": container with ID starting with bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7 not found: ID does not exist"
Apr 17 14:25:26.496650 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.496586 2568 scope.go:117] "RemoveContainer" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"
Apr 17 14:25:26.496958 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:25:26.496924 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": container with ID starting with 9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a not found: ID does not exist" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"
Apr 17 14:25:26.497028 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.496953 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"} err="failed to get container status \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": rpc error: code = NotFound desc = could not find container \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": container with ID starting with 9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a not found: ID does not exist"
Apr 17 14:25:26.497028 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.496974 2568 scope.go:117] "RemoveContainer" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"
Apr 17 14:25:26.497293 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:25:26.497274 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": container with ID starting with 7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6 not found: ID does not exist" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"
Apr 17 14:25:26.497376 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.497303 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"} err="failed to get container status \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": rpc error: code = NotFound desc = could not find container \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": container with ID starting with 7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6 not found: ID does not exist"
Apr 17 14:25:26.497376 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.497322 2568 scope.go:117] "RemoveContainer" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"
Apr 17 14:25:26.497586 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:25:26.497570 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": container with ID starting with 997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d not found: ID does not exist" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"
Apr 17 14:25:26.497643 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.497601 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"} err="failed to get container status \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": rpc error: code = NotFound desc = could not find container \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": container with ID starting with 997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d not found: ID does not exist"
Apr 17 14:25:26.497643 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.497621 2568 scope.go:117] "RemoveContainer" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"
Apr 17 14:25:26.497824 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.497807 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 14:25:26.497881 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:25:26.497823 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": container with ID starting with 72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80 not found: ID does not exist" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"
Apr 17 14:25:26.497881 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.497843 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"} err="failed to get container status \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": rpc error: code = NotFound desc = could not find container \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": container with ID starting with 72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80 not found: ID does not exist"
Apr 17 14:25:26.497881 ip-10-0-143-215 kubenswrapper[2568]:
I0417 14:25:26.497858 2568 scope.go:117] "RemoveContainer" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" Apr 17 14:25:26.498108 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:25:26.498088 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": container with ID starting with d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f not found: ID does not exist" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498111 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"} err="failed to get container status \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": rpc error: code = NotFound desc = could not find container \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": container with ID starting with d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f not found: ID does not exist" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498125 2568 scope.go:117] "RemoveContainer" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498146 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="init-config-reloader" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498177 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="init-config-reloader" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498193 2568 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="thanos-sidecar" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498202 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="thanos-sidecar" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498210 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="config-reloader" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498216 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="config-reloader" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498223 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-thanos" Apr 17 14:25:26.498225 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498228 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-thanos" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498243 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="prometheus" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498251 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="prometheus" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498260 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-web" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 
14:25:26.498268 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-web" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498275 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498280 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498333 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-web" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498344 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy-thanos" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498350 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="thanos-sidecar" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498361 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="prometheus" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498361 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"} err="failed to get container status \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": rpc error: code = NotFound desc = could not find container \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": container with ID 
starting with 8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b not found: ID does not exist" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498381 2568 scope.go:117] "RemoveContainer" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498369 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="config-reloader" Apr 17 14:25:26.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498425 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d469e034-4c97-4729-90fb-5a3448054898" containerName="kube-rbac-proxy" Apr 17 14:25:26.499025 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498596 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"} err="failed to get container status \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": rpc error: code = NotFound desc = could not find container \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": container with ID starting with bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7 not found: ID does not exist" Apr 17 14:25:26.499025 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498613 2568 scope.go:117] "RemoveContainer" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a" Apr 17 14:25:26.499025 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498801 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"} err="failed to get container status \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": rpc error: code = NotFound desc = could not find container 
\"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": container with ID starting with 9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a not found: ID does not exist" Apr 17 14:25:26.499025 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498815 2568 scope.go:117] "RemoveContainer" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6" Apr 17 14:25:26.499025 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.498996 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"} err="failed to get container status \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": rpc error: code = NotFound desc = could not find container \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": container with ID starting with 7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6 not found: ID does not exist" Apr 17 14:25:26.499025 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.499013 2568 scope.go:117] "RemoveContainer" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d" Apr 17 14:25:26.499378 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.499357 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"} err="failed to get container status \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": rpc error: code = NotFound desc = could not find container \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": container with ID starting with 997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d not found: ID does not exist" Apr 17 14:25:26.499452 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.499379 2568 scope.go:117] "RemoveContainer" 
containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80" Apr 17 14:25:26.499824 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.499773 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"} err="failed to get container status \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": rpc error: code = NotFound desc = could not find container \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": container with ID starting with 72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80 not found: ID does not exist" Apr 17 14:25:26.499824 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.499799 2568 scope.go:117] "RemoveContainer" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" Apr 17 14:25:26.500092 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.500058 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"} err="failed to get container status \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": rpc error: code = NotFound desc = could not find container \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": container with ID starting with d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f not found: ID does not exist" Apr 17 14:25:26.500193 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.500092 2568 scope.go:117] "RemoveContainer" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b" Apr 17 14:25:26.500353 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.500330 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"} err="failed to get container status 
\"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": rpc error: code = NotFound desc = could not find container \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": container with ID starting with 8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b not found: ID does not exist" Apr 17 14:25:26.500399 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.500354 2568 scope.go:117] "RemoveContainer" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7" Apr 17 14:25:26.500593 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.500569 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"} err="failed to get container status \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": rpc error: code = NotFound desc = could not find container \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": container with ID starting with bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7 not found: ID does not exist" Apr 17 14:25:26.500643 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.500594 2568 scope.go:117] "RemoveContainer" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a" Apr 17 14:25:26.500811 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.500791 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"} err="failed to get container status \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": rpc error: code = NotFound desc = could not find container \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": container with ID starting with 9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a not found: ID does not exist" Apr 17 14:25:26.500874 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:25:26.500812 2568 scope.go:117] "RemoveContainer" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6" Apr 17 14:25:26.501027 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501008 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"} err="failed to get container status \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": rpc error: code = NotFound desc = could not find container \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": container with ID starting with 7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6 not found: ID does not exist" Apr 17 14:25:26.501088 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501029 2568 scope.go:117] "RemoveContainer" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d" Apr 17 14:25:26.501264 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501247 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"} err="failed to get container status \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": rpc error: code = NotFound desc = could not find container \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": container with ID starting with 997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d not found: ID does not exist" Apr 17 14:25:26.501264 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501264 2568 scope.go:117] "RemoveContainer" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80" Apr 17 14:25:26.501443 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501416 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"} err="failed to get container status \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": rpc error: code = NotFound desc = could not find container \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": container with ID starting with 72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80 not found: ID does not exist" Apr 17 14:25:26.501513 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501444 2568 scope.go:117] "RemoveContainer" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" Apr 17 14:25:26.501630 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501616 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"} err="failed to get container status \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": rpc error: code = NotFound desc = could not find container \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": container with ID starting with d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f not found: ID does not exist" Apr 17 14:25:26.501672 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501630 2568 scope.go:117] "RemoveContainer" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b" Apr 17 14:25:26.501818 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501798 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"} err="failed to get container status \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": rpc error: code = NotFound desc = could not find container \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": container with ID starting with 
8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b not found: ID does not exist" Apr 17 14:25:26.501818 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.501816 2568 scope.go:117] "RemoveContainer" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7" Apr 17 14:25:26.502021 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502004 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"} err="failed to get container status \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": rpc error: code = NotFound desc = could not find container \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": container with ID starting with bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7 not found: ID does not exist" Apr 17 14:25:26.502021 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502021 2568 scope.go:117] "RemoveContainer" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a" Apr 17 14:25:26.502285 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502264 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"} err="failed to get container status \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": rpc error: code = NotFound desc = could not find container \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": container with ID starting with 9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a not found: ID does not exist" Apr 17 14:25:26.502285 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502285 2568 scope.go:117] "RemoveContainer" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6" Apr 17 14:25:26.502450 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502435 2568 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"} err="failed to get container status \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": rpc error: code = NotFound desc = could not find container \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": container with ID starting with 7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6 not found: ID does not exist" Apr 17 14:25:26.502493 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502450 2568 scope.go:117] "RemoveContainer" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d" Apr 17 14:25:26.502588 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502574 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"} err="failed to get container status \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": rpc error: code = NotFound desc = could not find container \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": container with ID starting with 997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d not found: ID does not exist" Apr 17 14:25:26.502631 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502588 2568 scope.go:117] "RemoveContainer" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80" Apr 17 14:25:26.502814 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502794 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"} err="failed to get container status \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": rpc error: code = NotFound desc = could not find container 
\"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": container with ID starting with 72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80 not found: ID does not exist" Apr 17 14:25:26.502876 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.502815 2568 scope.go:117] "RemoveContainer" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" Apr 17 14:25:26.503056 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.503031 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"} err="failed to get container status \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": rpc error: code = NotFound desc = could not find container \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": container with ID starting with d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f not found: ID does not exist" Apr 17 14:25:26.503113 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.503057 2568 scope.go:117] "RemoveContainer" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b" Apr 17 14:25:26.503328 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.503307 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"} err="failed to get container status \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": rpc error: code = NotFound desc = could not find container \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": container with ID starting with 8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b not found: ID does not exist" Apr 17 14:25:26.503393 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.503329 2568 scope.go:117] "RemoveContainer" 
containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7" Apr 17 14:25:26.503593 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.503566 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"} err="failed to get container status \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": rpc error: code = NotFound desc = could not find container \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": container with ID starting with bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7 not found: ID does not exist" Apr 17 14:25:26.503667 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.503593 2568 scope.go:117] "RemoveContainer" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a" Apr 17 14:25:26.505417 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.505379 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"} err="failed to get container status \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": rpc error: code = NotFound desc = could not find container \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": container with ID starting with 9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a not found: ID does not exist" Apr 17 14:25:26.505526 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.505418 2568 scope.go:117] "RemoveContainer" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6" Apr 17 14:25:26.505588 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.505567 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.505665 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.505647 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"} err="failed to get container status \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": rpc error: code = NotFound desc = could not find container \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": container with ID starting with 7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6 not found: ID does not exist" Apr 17 14:25:26.505719 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.505666 2568 scope.go:117] "RemoveContainer" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d" Apr 17 14:25:26.506301 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.506277 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"} err="failed to get container status \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": rpc error: code = NotFound desc = could not find container \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": container with ID starting with 997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d not found: ID does not exist" Apr 17 14:25:26.506301 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.506300 2568 scope.go:117] "RemoveContainer" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80" Apr 17 14:25:26.506529 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.506510 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"} err="failed to get container status 
\"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": rpc error: code = NotFound desc = could not find container \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": container with ID starting with 72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80 not found: ID does not exist" Apr 17 14:25:26.506586 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.506530 2568 scope.go:117] "RemoveContainer" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" Apr 17 14:25:26.506746 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.506723 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"} err="failed to get container status \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": rpc error: code = NotFound desc = could not find container \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": container with ID starting with d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f not found: ID does not exist" Apr 17 14:25:26.506797 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.506749 2568 scope.go:117] "RemoveContainer" containerID="8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b" Apr 17 14:25:26.506982 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.506964 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b"} err="failed to get container status \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": rpc error: code = NotFound desc = could not find container \"8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b\": container with ID starting with 8e3a206d04b1adbd983206d01592d6a2ec7565fd9deed62d653864ec351f5b4b not found: ID does not exist" Apr 17 14:25:26.507034 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:25:26.506983 2568 scope.go:117] "RemoveContainer" containerID="bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7" Apr 17 14:25:26.507184 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507153 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7"} err="failed to get container status \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": rpc error: code = NotFound desc = could not find container \"bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7\": container with ID starting with bba5c8956e9b653e90973e6e218bb80efbcb4390a7620301446f0f17efba48f7 not found: ID does not exist" Apr 17 14:25:26.507184 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507180 2568 scope.go:117] "RemoveContainer" containerID="9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a" Apr 17 14:25:26.507434 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507415 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a"} err="failed to get container status \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": rpc error: code = NotFound desc = could not find container \"9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a\": container with ID starting with 9e72ce877f59f497129b48a119d7540112f1fe3433410997fe81de8982345f9a not found: ID does not exist" Apr 17 14:25:26.507500 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507436 2568 scope.go:117] "RemoveContainer" containerID="7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6" Apr 17 14:25:26.507648 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507629 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6"} err="failed to get container status \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": rpc error: code = NotFound desc = could not find container \"7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6\": container with ID starting with 7301be5ffcba9a263c1315949aadc56e78d1748064e274065502e0e5ddec95d6 not found: ID does not exist" Apr 17 14:25:26.507648 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507648 2568 scope.go:117] "RemoveContainer" containerID="997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d" Apr 17 14:25:26.507863 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507844 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d"} err="failed to get container status \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": rpc error: code = NotFound desc = could not find container \"997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d\": container with ID starting with 997f2a70dd404e4867ce6621c35aad67b6a053345296681d2941c3dffea0523d not found: ID does not exist" Apr 17 14:25:26.507863 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.507863 2568 scope.go:117] "RemoveContainer" containerID="72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80" Apr 17 14:25:26.508095 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508065 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80"} err="failed to get container status \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": rpc error: code = NotFound desc = could not find container \"72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80\": container with ID starting with 
72d323f60c00ae77bb55a43a823f271c6356a741635431a7e065238b7abc2c80 not found: ID does not exist" Apr 17 14:25:26.508095 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508094 2568 scope.go:117] "RemoveContainer" containerID="d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f" Apr 17 14:25:26.508381 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508351 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f"} err="failed to get container status \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": rpc error: code = NotFound desc = could not find container \"d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f\": container with ID starting with d967e40f52613a8ff97e89f86469681b384787c5eb519873977c7378273f547f not found: ID does not exist" Apr 17 14:25:26.508654 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508423 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 14:25:26.508654 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508519 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 14:25:26.508654 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508534 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 14:25:26.508654 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508549 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 14:25:26.508654 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508585 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 
17 14:25:26.508978 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508962 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 14:25:26.509041 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.508964 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8d6tk\"" Apr 17 14:25:26.509077 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.509049 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-mrj6khmfdqv8\"" Apr 17 14:25:26.509537 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.509523 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 14:25:26.509646 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.509607 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 14:25:26.509646 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.509619 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 14:25:26.509762 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.509648 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 14:25:26.509847 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.509787 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 14:25:26.514418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.514132 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 14:25:26.514573 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.514553 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:25:26.515318 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.515303 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 14:25:26.592789 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.592756 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.592789 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.592794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593004 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.592823 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-config-out\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593004 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.592870 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-metrics-client-ca\") pod 
\"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593004 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.592928 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593004 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.592961 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-web-config\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593004 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.592976 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzm62\" (UniqueName: \"kubernetes.io/projected/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-kube-api-access-kzm62\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593004 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593000 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593211 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593211 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593211 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593211 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593214 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-config\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593338 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:25:26.593247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593273 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593338 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593321 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.593482 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.593341 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.694142 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.694103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.694142 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.694143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-config\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.694379 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.694160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.694379 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.694204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695114 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.694759 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695114 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.694825 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695114 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.695049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695390 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.695263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695390 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.695302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695390 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:25:26.695357 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-config-out\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695390 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.695387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695647 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.695423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695823 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.695807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-web-config\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.695936 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.695922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzm62\" (UniqueName: \"kubernetes.io/projected/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-kube-api-access-kzm62\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
14:25:26.696117 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.696033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.696249 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.696234 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.696399 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.696385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.696517 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.696498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.697773 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.697513 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-config\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.698826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.697891 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.698826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.697958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.698826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.698442 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.698826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.698598 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.699973 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.699651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.700593 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.700560 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.700777 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.700755 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.701280 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.701256 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-web-config\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.701800 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.701483 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.701800 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.701594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.701936 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.701878 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.701936 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.701890 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.702752 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.702240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.703742 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.703401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.703742 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.703697 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.704360 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.704341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-config-out\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.706240 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.706221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzm62\" (UniqueName: \"kubernetes.io/projected/b5dd786f-1d83-4b2c-a56a-6c01b76118e4-kube-api-access-kzm62\") pod \"prometheus-k8s-0\" (UID: \"b5dd786f-1d83-4b2c-a56a-6c01b76118e4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.817435 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.817396 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:25:26.941533 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:26.941479 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 14:25:26.944345 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:25:26.944311 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5dd786f_1d83_4b2c_a56a_6c01b76118e4.slice/crio-352e5cfcc05ea7d6c1ea9a5bfade4d0a4dfc6e567079d112e58147ae0f1dfdc4 WatchSource:0}: Error finding container 352e5cfcc05ea7d6c1ea9a5bfade4d0a4dfc6e567079d112e58147ae0f1dfdc4: Status 404 returned error can't find the container with id 352e5cfcc05ea7d6c1ea9a5bfade4d0a4dfc6e567079d112e58147ae0f1dfdc4 Apr 17 14:25:27.449311 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:27.449276 2568 generic.go:358] "Generic (PLEG): container finished" podID="b5dd786f-1d83-4b2c-a56a-6c01b76118e4" containerID="a4e656f33b2850c1f9a602e3bde2acbc2373bbac7531ecebbceda15e6e6d44c0" exitCode=0 Apr 17 14:25:27.449506 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:27.449364 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerDied","Data":"a4e656f33b2850c1f9a602e3bde2acbc2373bbac7531ecebbceda15e6e6d44c0"} Apr 17 14:25:27.449506 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:27.449405 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerStarted","Data":"352e5cfcc05ea7d6c1ea9a5bfade4d0a4dfc6e567079d112e58147ae0f1dfdc4"} Apr 17 14:25:27.583820 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:27.583791 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d469e034-4c97-4729-90fb-5a3448054898" 
path="/var/lib/kubelet/pods/d469e034-4c97-4729-90fb-5a3448054898/volumes" Apr 17 14:25:28.456974 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:28.456943 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerStarted","Data":"284a1d89371d439a53c0b2d6e947222d8b7d00368550c8b93f5390778eac3533"} Apr 17 14:25:28.456974 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:28.456976 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerStarted","Data":"00667d943008e6390af18e896ebab4261d9573735d407cd38600ae7659450fbf"} Apr 17 14:25:28.457409 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:28.456988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerStarted","Data":"862ab04cad15c8da5b907eb1ab5d3bf6131584cff7adf34c91ce0a7808832608"} Apr 17 14:25:28.457409 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:28.456997 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerStarted","Data":"7ffdc63bc421f069c3f7697766015f34815ccf561b5b0666f74a7a0416322f95"} Apr 17 14:25:28.457409 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:28.457005 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerStarted","Data":"b98d630d0cf3c8eccba2aea83770f5096dfc6155787addfa9eb6973664ba5701"} Apr 17 14:25:28.457409 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:28.457013 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b5dd786f-1d83-4b2c-a56a-6c01b76118e4","Type":"ContainerStarted","Data":"45392d1609b833f5b0319fee77860eb4c0bff9b6a946f6970ca8ed9120976c1d"} Apr 17 14:25:28.485133 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:28.485077 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.485062703 podStartE2EDuration="2.485062703s" podCreationTimestamp="2026-04-17 14:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:25:28.482326464 +0000 UTC m=+223.516532377" watchObservedRunningTime="2026-04-17 14:25:28.485062703 +0000 UTC m=+223.519268604" Apr 17 14:25:31.817661 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:25:31.817623 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:26:26.818501 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:26.818457 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:26:26.833775 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:26.833747 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:26:27.638148 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:27.638122 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 14:26:45.477175 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:45.477129 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:26:45.478279 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:45.478255 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:26:45.481546 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:45.481518 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:26:45.482681 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:45.482663 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:26:45.487032 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:26:45.487016 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 14:27:10.512415 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.512379 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8kqjl"] Apr 17 14:27:10.515482 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.515466 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.517935 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.517915 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:27:10.522115 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.522096 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8kqjl"] Apr 17 14:27:10.620325 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.620290 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-original-pull-secret\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.620509 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.620354 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-dbus\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.620568 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.620518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-kubelet-config\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.721773 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.721727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-kubelet-config\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.721977 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.721800 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-original-pull-secret\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.721977 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.721819 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-kubelet-config\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.721977 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.721881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-dbus\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.722119 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.722046 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-dbus\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.724138 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.724117 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29-original-pull-secret\") pod \"global-pull-secret-syncer-8kqjl\" (UID: \"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29\") " pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.826301 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.826198 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8kqjl" Apr 17 14:27:10.942319 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.942293 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8kqjl"] Apr 17 14:27:10.944661 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:27:10.944627 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb5253c2_c2a0_4c5e_b333_f7ecd4bb0d29.slice/crio-dc47be53642773ee0480e7ab0ba1326462573efaf4177c6e4e921e773076d44e WatchSource:0}: Error finding container dc47be53642773ee0480e7ab0ba1326462573efaf4177c6e4e921e773076d44e: Status 404 returned error can't find the container with id dc47be53642773ee0480e7ab0ba1326462573efaf4177c6e4e921e773076d44e Apr 17 14:27:10.946348 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:10.946327 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:27:11.745842 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:11.745810 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8kqjl" event={"ID":"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29","Type":"ContainerStarted","Data":"dc47be53642773ee0480e7ab0ba1326462573efaf4177c6e4e921e773076d44e"} Apr 17 14:27:14.756408 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:14.756376 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8kqjl" 
event={"ID":"bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29","Type":"ContainerStarted","Data":"43c752e97dc5919a542258aa557a4e65403f4dde803a686fef0cd6645731b39e"} Apr 17 14:27:14.772441 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:27:14.772397 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8kqjl" podStartSLOduration=1.273581113 podStartE2EDuration="4.772383161s" podCreationTimestamp="2026-04-17 14:27:10 +0000 UTC" firstStartedPulling="2026-04-17 14:27:10.946484625 +0000 UTC m=+325.980690500" lastFinishedPulling="2026-04-17 14:27:14.445286658 +0000 UTC m=+329.479492548" observedRunningTime="2026-04-17 14:27:14.770693902 +0000 UTC m=+329.804899797" watchObservedRunningTime="2026-04-17 14:27:14.772383161 +0000 UTC m=+329.806589056" Apr 17 14:28:16.781368 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.781331 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-94ntn"] Apr 17 14:28:16.784603 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.784583 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:16.787247 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.787226 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 14:28:16.787360 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.787290 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-dws7m\"" Apr 17 14:28:16.788431 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.788417 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 14:28:16.792232 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.792207 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-94ntn"] Apr 17 14:28:16.859032 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.859003 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4ead27e-05d9-475b-b978-a9e0a11ca04a-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-94ntn\" (UID: \"a4ead27e-05d9-475b-b978-a9e0a11ca04a\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:16.859214 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.859039 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4q7\" (UniqueName: \"kubernetes.io/projected/a4ead27e-05d9-475b-b978-a9e0a11ca04a-kube-api-access-qd4q7\") pod \"cert-manager-cainjector-8966b78d4-94ntn\" (UID: \"a4ead27e-05d9-475b-b978-a9e0a11ca04a\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:16.960208 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.960160 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qd4q7\" (UniqueName: \"kubernetes.io/projected/a4ead27e-05d9-475b-b978-a9e0a11ca04a-kube-api-access-qd4q7\") pod \"cert-manager-cainjector-8966b78d4-94ntn\" (UID: \"a4ead27e-05d9-475b-b978-a9e0a11ca04a\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:16.960341 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.960288 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4ead27e-05d9-475b-b978-a9e0a11ca04a-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-94ntn\" (UID: \"a4ead27e-05d9-475b-b978-a9e0a11ca04a\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:16.968089 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.968062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4ead27e-05d9-475b-b978-a9e0a11ca04a-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-94ntn\" (UID: \"a4ead27e-05d9-475b-b978-a9e0a11ca04a\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:16.968308 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:16.968286 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4q7\" (UniqueName: \"kubernetes.io/projected/a4ead27e-05d9-475b-b978-a9e0a11ca04a-kube-api-access-qd4q7\") pod \"cert-manager-cainjector-8966b78d4-94ntn\" (UID: \"a4ead27e-05d9-475b-b978-a9e0a11ca04a\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:17.109710 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:17.109630 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" Apr 17 14:28:17.230062 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:17.230009 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-94ntn"] Apr 17 14:28:17.232890 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:28:17.232852 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ead27e_05d9_475b_b978_a9e0a11ca04a.slice/crio-17ab696a3d232383cf95506929648044e7efdb5c67ab5487ace111ab756dec67 WatchSource:0}: Error finding container 17ab696a3d232383cf95506929648044e7efdb5c67ab5487ace111ab756dec67: Status 404 returned error can't find the container with id 17ab696a3d232383cf95506929648044e7efdb5c67ab5487ace111ab756dec67 Apr 17 14:28:17.928950 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:17.928915 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" event={"ID":"a4ead27e-05d9-475b-b978-a9e0a11ca04a","Type":"ContainerStarted","Data":"17ab696a3d232383cf95506929648044e7efdb5c67ab5487ace111ab756dec67"} Apr 17 14:28:20.940837 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:20.940797 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" event={"ID":"a4ead27e-05d9-475b-b978-a9e0a11ca04a","Type":"ContainerStarted","Data":"e5b99c8c8c0eb6bfd17eb33120dcc2a0633614ad75d8b7643bba1c511fdc88c3"} Apr 17 14:28:20.957041 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:20.956995 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-94ntn" podStartSLOduration=1.831324461 podStartE2EDuration="4.956981881s" podCreationTimestamp="2026-04-17 14:28:16 +0000 UTC" firstStartedPulling="2026-04-17 14:28:17.235012992 +0000 UTC m=+392.269218866" lastFinishedPulling="2026-04-17 
14:28:20.360670411 +0000 UTC m=+395.394876286" observedRunningTime="2026-04-17 14:28:20.955728157 +0000 UTC m=+395.989934050" watchObservedRunningTime="2026-04-17 14:28:20.956981881 +0000 UTC m=+395.991187778" Apr 17 14:28:48.028569 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.028535 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld"] Apr 17 14:28:48.032001 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.031968 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.038130 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.038105 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 14:28:48.038387 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.038109 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 14:28:48.038469 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.038126 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 14:28:48.038542 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.038156 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-268bl\"" Apr 17 14:28:48.038596 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.038159 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 14:28:48.044557 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.044536 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld"] Apr 17 14:28:48.198833 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.198802 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nczr\" (UniqueName: \"kubernetes.io/projected/61119e72-7585-49d8-ab6c-37132891c232-kube-api-access-4nczr\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.198986 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.198854 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/61119e72-7585-49d8-ab6c-37132891c232-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.198986 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.198925 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/61119e72-7585-49d8-ab6c-37132891c232-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.300331 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.300238 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nczr\" (UniqueName: \"kubernetes.io/projected/61119e72-7585-49d8-ab6c-37132891c232-kube-api-access-4nczr\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.300331 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.300309 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/61119e72-7585-49d8-ab6c-37132891c232-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.300663 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.300360 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/61119e72-7585-49d8-ab6c-37132891c232-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.303671 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.303643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/61119e72-7585-49d8-ab6c-37132891c232-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.303791 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.303667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/61119e72-7585-49d8-ab6c-37132891c232-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.310638 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.310616 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nczr\" (UniqueName: 
\"kubernetes.io/projected/61119e72-7585-49d8-ab6c-37132891c232-kube-api-access-4nczr\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-tfhld\" (UID: \"61119e72-7585-49d8-ab6c-37132891c232\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.344701 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.344677 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" Apr 17 14:28:48.476826 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:48.476785 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld"] Apr 17 14:28:48.480834 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:28:48.480792 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61119e72_7585_49d8_ab6c_37132891c232.slice/crio-fec5fb23eac71e6556099ad22041c4a6e4e023bfe335289d5026c0784a69cb4d WatchSource:0}: Error finding container fec5fb23eac71e6556099ad22041c4a6e4e023bfe335289d5026c0784a69cb4d: Status 404 returned error can't find the container with id fec5fb23eac71e6556099ad22041c4a6e4e023bfe335289d5026c0784a69cb4d Apr 17 14:28:49.025175 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.025139 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" event={"ID":"61119e72-7585-49d8-ab6c-37132891c232","Type":"ContainerStarted","Data":"fec5fb23eac71e6556099ad22041c4a6e4e023bfe335289d5026c0784a69cb4d"} Apr 17 14:28:49.720517 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.720465 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"] Apr 17 14:28:49.727275 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.727250 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" Apr 17 14:28:49.731110 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.731085 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 14:28:49.731254 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.731196 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:28:49.731469 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.731446 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 14:28:49.731614 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.731598 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 14:28:49.731707 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.731628 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 14:28:49.731768 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.731724 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"] Apr 17 14:28:49.731821 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.731806 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pml4c\"" Apr 17 14:28:49.814613 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.814575 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b1bb5c1b-be4b-4680-9309-126980eafeac-manager-config\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " 
pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" Apr 17 14:28:49.814792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.814657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1bb5c1b-be4b-4680-9309-126980eafeac-cert\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" Apr 17 14:28:49.814792 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.814695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b1bb5c1b-be4b-4680-9309-126980eafeac-metrics-cert\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" Apr 17 14:28:49.814905 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.814806 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mxsg\" (UniqueName: \"kubernetes.io/projected/b1bb5c1b-be4b-4680-9309-126980eafeac-kube-api-access-6mxsg\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" Apr 17 14:28:49.916303 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.916268 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1bb5c1b-be4b-4680-9309-126980eafeac-cert\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" Apr 17 14:28:49.916303 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.916309 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b1bb5c1b-be4b-4680-9309-126980eafeac-metrics-cert\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:49.916512 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.916352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mxsg\" (UniqueName: \"kubernetes.io/projected/b1bb5c1b-be4b-4680-9309-126980eafeac-kube-api-access-6mxsg\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:49.916512 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.916375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b1bb5c1b-be4b-4680-9309-126980eafeac-manager-config\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:49.917118 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.917094 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b1bb5c1b-be4b-4680-9309-126980eafeac-manager-config\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:49.919367 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.919347 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b1bb5c1b-be4b-4680-9309-126980eafeac-metrics-cert\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:49.919467 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.919419 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1bb5c1b-be4b-4680-9309-126980eafeac-cert\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:49.924429 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:49.924408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mxsg\" (UniqueName: \"kubernetes.io/projected/b1bb5c1b-be4b-4680-9309-126980eafeac-kube-api-access-6mxsg\") pod \"lws-controller-manager-5b89f4cf56-f675g\" (UID: \"b1bb5c1b-be4b-4680-9309-126980eafeac\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:50.040310 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:50.040243 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:51.036277 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:51.036251 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"]
Apr 17 14:28:51.038621 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:28:51.038594 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1bb5c1b_be4b_4680_9309_126980eafeac.slice/crio-e5df8e7c0214f03e53acb34dfab317eb80cbd0585135931e70b235e91e26b16e WatchSource:0}: Error finding container e5df8e7c0214f03e53acb34dfab317eb80cbd0585135931e70b235e91e26b16e: Status 404 returned error can't find the container with id e5df8e7c0214f03e53acb34dfab317eb80cbd0585135931e70b235e91e26b16e
Apr 17 14:28:52.038725 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:52.038691 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" event={"ID":"61119e72-7585-49d8-ab6c-37132891c232","Type":"ContainerStarted","Data":"aabafc357d7fa7e06782b440d305a77c9895dd155c301d657ba76eeb15331099"}
Apr 17 14:28:52.039155 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:52.038755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld"
Apr 17 14:28:52.039720 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:52.039700 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" event={"ID":"b1bb5c1b-be4b-4680-9309-126980eafeac","Type":"ContainerStarted","Data":"e5df8e7c0214f03e53acb34dfab317eb80cbd0585135931e70b235e91e26b16e"}
Apr 17 14:28:52.061022 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:52.060833 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld" podStartSLOduration=1.58004086 podStartE2EDuration="4.060816094s" podCreationTimestamp="2026-04-17 14:28:48 +0000 UTC" firstStartedPulling="2026-04-17 14:28:48.482687808 +0000 UTC m=+423.516893683" lastFinishedPulling="2026-04-17 14:28:50.96346303 +0000 UTC m=+425.997668917" observedRunningTime="2026-04-17 14:28:52.057741889 +0000 UTC m=+427.091947797" watchObservedRunningTime="2026-04-17 14:28:52.060816094 +0000 UTC m=+427.095021993"
Apr 17 14:28:55.051442 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:55.051407 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" event={"ID":"b1bb5c1b-be4b-4680-9309-126980eafeac","Type":"ContainerStarted","Data":"7666d5296c501f44e2b53ec60903e263e7981e2d45ebeaf9f943eebdda633d9c"}
Apr 17 14:28:55.051883 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:55.051525 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:28:55.068440 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:28:55.068392 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g" podStartSLOduration=3.101270998 podStartE2EDuration="6.068378473s" podCreationTimestamp="2026-04-17 14:28:49 +0000 UTC" firstStartedPulling="2026-04-17 14:28:51.040205103 +0000 UTC m=+426.074410985" lastFinishedPulling="2026-04-17 14:28:54.007312585 +0000 UTC m=+429.041518460" observedRunningTime="2026-04-17 14:28:55.06695068 +0000 UTC m=+430.101156571" watchObservedRunningTime="2026-04-17 14:28:55.068378473 +0000 UTC m=+430.102584368"
Apr 17 14:29:03.046454 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:03.046422 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-tfhld"
Apr 17 14:29:05.174284 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.174249 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"]
Apr 17 14:29:05.176597 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.176567 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.179151 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.179126 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 14:29:05.179297 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.179137 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-m9jnm\""
Apr 17 14:29:05.179297 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.179207 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 14:29:05.180376 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.180357 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 14:29:05.180481 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.180374 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 14:29:05.188107 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.188084 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"]
Apr 17 14:29:05.238247 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.238157 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkdtl\" (UniqueName: \"kubernetes.io/projected/586d834f-6ebb-4092-9eec-686a0d6fbccc-kube-api-access-vkdtl\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.238247 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.238247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/586d834f-6ebb-4092-9eec-686a0d6fbccc-tmp\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.238466 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.238314 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/586d834f-6ebb-4092-9eec-686a0d6fbccc-tls-certs\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.339139 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.339104 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkdtl\" (UniqueName: \"kubernetes.io/projected/586d834f-6ebb-4092-9eec-686a0d6fbccc-kube-api-access-vkdtl\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.339343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.339187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/586d834f-6ebb-4092-9eec-686a0d6fbccc-tmp\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.339343 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.339225 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/586d834f-6ebb-4092-9eec-686a0d6fbccc-tls-certs\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.339460 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:29:05.339361 2568 secret.go:189] Couldn't get secret openshift-ingress/kube-auth-proxy-tls: secret "kube-auth-proxy-tls" not found
Apr 17 14:29:05.339460 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:29:05.339454 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/586d834f-6ebb-4092-9eec-686a0d6fbccc-tls-certs podName:586d834f-6ebb-4092-9eec-686a0d6fbccc nodeName:}" failed. No retries permitted until 2026-04-17 14:29:05.839429982 +0000 UTC m=+440.873635860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/586d834f-6ebb-4092-9eec-686a0d6fbccc-tls-certs") pod "kube-auth-proxy-5b474cc896-ndktp" (UID: "586d834f-6ebb-4092-9eec-686a0d6fbccc") : secret "kube-auth-proxy-tls" not found
Apr 17 14:29:05.341400 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.341379 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/586d834f-6ebb-4092-9eec-686a0d6fbccc-tmp\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.353418 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.353394 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkdtl\" (UniqueName: \"kubernetes.io/projected/586d834f-6ebb-4092-9eec-686a0d6fbccc-kube-api-access-vkdtl\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.843610 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.843579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/586d834f-6ebb-4092-9eec-686a0d6fbccc-tls-certs\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:05.845965 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:05.845936 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/586d834f-6ebb-4092-9eec-686a0d6fbccc-tls-certs\") pod \"kube-auth-proxy-5b474cc896-ndktp\" (UID: \"586d834f-6ebb-4092-9eec-686a0d6fbccc\") " pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:06.057536 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:06.057501 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-f675g"
Apr 17 14:29:06.087067 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:06.087034 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"
Apr 17 14:29:06.217782 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:06.217749 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5b474cc896-ndktp"]
Apr 17 14:29:06.221605 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:29:06.221579 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod586d834f_6ebb_4092_9eec_686a0d6fbccc.slice/crio-c467e9128f03cf784af41773cefe61335468825fb651a9fb79520ef4722bb266 WatchSource:0}: Error finding container c467e9128f03cf784af41773cefe61335468825fb651a9fb79520ef4722bb266: Status 404 returned error can't find the container with id c467e9128f03cf784af41773cefe61335468825fb651a9fb79520ef4722bb266
Apr 17 14:29:07.094600 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:07.094558 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp" event={"ID":"586d834f-6ebb-4092-9eec-686a0d6fbccc","Type":"ContainerStarted","Data":"c467e9128f03cf784af41773cefe61335468825fb651a9fb79520ef4722bb266"}
Apr 17 14:29:10.108664 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:10.108551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp" event={"ID":"586d834f-6ebb-4092-9eec-686a0d6fbccc","Type":"ContainerStarted","Data":"378267a7df481ce1821897a29eb27061bc2d3dd0b2a2fff8b73a3fc114f06b4a"}
Apr 17 14:29:10.127048 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:29:10.127005 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5b474cc896-ndktp" podStartSLOduration=1.616408676 podStartE2EDuration="5.126990451s" podCreationTimestamp="2026-04-17 14:29:05 +0000 UTC" firstStartedPulling="2026-04-17 14:29:06.223286938 +0000 UTC m=+441.257492813" lastFinishedPulling="2026-04-17 14:29:09.733868711 +0000 UTC m=+444.768074588" observedRunningTime="2026-04-17 14:29:10.124955496 +0000 UTC m=+445.159161396" watchObservedRunningTime="2026-04-17 14:29:10.126990451 +0000 UTC m=+445.161196411"
Apr 17 14:30:48.666444 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.666409 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"]
Apr 17 14:30:48.668708 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.668685 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.671452 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.671431 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 14:30:48.671584 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.671431 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 17 14:30:48.671584 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.671434 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 17 14:30:48.671584 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.671434 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ngzg4\""
Apr 17 14:30:48.672759 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.672743 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 14:30:48.679053 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.679030 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"]
Apr 17 14:30:48.776641 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.776608 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a1823878-6307-493c-ac83-fd78a1ba3c17-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.776777 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.776676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd2kp\" (UniqueName: \"kubernetes.io/projected/a1823878-6307-493c-ac83-fd78a1ba3c17-kube-api-access-hd2kp\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.776777 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.776719 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1823878-6307-493c-ac83-fd78a1ba3c17-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.877088 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.877060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1823878-6307-493c-ac83-fd78a1ba3c17-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.877265 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.877104 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a1823878-6307-493c-ac83-fd78a1ba3c17-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.877265 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.877152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd2kp\" (UniqueName: \"kubernetes.io/projected/a1823878-6307-493c-ac83-fd78a1ba3c17-kube-api-access-hd2kp\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.877265 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:30:48.877216 2568 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 17 14:30:48.877382 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:30:48.877287 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1823878-6307-493c-ac83-fd78a1ba3c17-plugin-serving-cert podName:a1823878-6307-493c-ac83-fd78a1ba3c17 nodeName:}" failed. No retries permitted until 2026-04-17 14:30:49.377270321 +0000 UTC m=+544.411476195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a1823878-6307-493c-ac83-fd78a1ba3c17-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-vwxf9" (UID: "a1823878-6307-493c-ac83-fd78a1ba3c17") : secret "plugin-serving-cert" not found
Apr 17 14:30:48.877779 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.877762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a1823878-6307-493c-ac83-fd78a1ba3c17-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:48.889456 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:48.889430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd2kp\" (UniqueName: \"kubernetes.io/projected/a1823878-6307-493c-ac83-fd78a1ba3c17-kube-api-access-hd2kp\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:49.380643 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:49.380602 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1823878-6307-493c-ac83-fd78a1ba3c17-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:49.383100 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:49.383074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1823878-6307-493c-ac83-fd78a1ba3c17-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vwxf9\" (UID: \"a1823878-6307-493c-ac83-fd78a1ba3c17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:49.578291 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:49.578251 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"
Apr 17 14:30:49.694501 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:49.694467 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9"]
Apr 17 14:30:49.696880 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:30:49.696854 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1823878_6307_493c_ac83_fd78a1ba3c17.slice/crio-a086da52801db980f1a3a8e58c31ce92895013e4fed145ce72dcebb274af7f44 WatchSource:0}: Error finding container a086da52801db980f1a3a8e58c31ce92895013e4fed145ce72dcebb274af7f44: Status 404 returned error can't find the container with id a086da52801db980f1a3a8e58c31ce92895013e4fed145ce72dcebb274af7f44
Apr 17 14:30:50.428177 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:30:50.428130 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9" event={"ID":"a1823878-6307-493c-ac83-fd78a1ba3c17","Type":"ContainerStarted","Data":"a086da52801db980f1a3a8e58c31ce92895013e4fed145ce72dcebb274af7f44"}
Apr 17 14:31:13.515540 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:13.515504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9" event={"ID":"a1823878-6307-493c-ac83-fd78a1ba3c17","Type":"ContainerStarted","Data":"fd2161b603980cf9d488cd753fac7253117175a5b50e2d6f1061a58b77ce9ea3"}
Apr 17 14:31:13.531112 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:13.531051 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vwxf9" podStartSLOduration=2.155184734 podStartE2EDuration="25.531033316s" podCreationTimestamp="2026-04-17 14:30:48 +0000 UTC" firstStartedPulling="2026-04-17 14:30:49.69821548 +0000 UTC m=+544.732421360" lastFinishedPulling="2026-04-17 14:31:13.074064061 +0000 UTC m=+568.108269942" observedRunningTime="2026-04-17 14:31:13.530159321 +0000 UTC m=+568.564365218" watchObservedRunningTime="2026-04-17 14:31:13.531033316 +0000 UTC m=+568.565239225"
Apr 17 14:31:32.246419 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.246383 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:31:32.289027 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.288995 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:31:32.289027 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.289025 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:31:32.289244 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.289143 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.291767 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.291744 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 14:31:32.356230 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.356201 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b-config-file\") pod \"limitador-limitador-78c99df468-m6hpq\" (UID: \"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b\") " pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.356430 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.356255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhhmb\" (UniqueName: \"kubernetes.io/projected/b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b-kube-api-access-hhhmb\") pod \"limitador-limitador-78c99df468-m6hpq\" (UID: \"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b\") " pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.456991 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.456958 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b-config-file\") pod \"limitador-limitador-78c99df468-m6hpq\" (UID: \"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b\") " pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.457157 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.457013 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhhmb\" (UniqueName: \"kubernetes.io/projected/b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b-kube-api-access-hhhmb\") pod \"limitador-limitador-78c99df468-m6hpq\" (UID: \"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b\") " pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.457596 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.457576 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b-config-file\") pod \"limitador-limitador-78c99df468-m6hpq\" (UID: \"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b\") " pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.464716 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.464694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhhmb\" (UniqueName: \"kubernetes.io/projected/b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b-kube-api-access-hhhmb\") pod \"limitador-limitador-78c99df468-m6hpq\" (UID: \"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b\") " pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.604676 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.604596 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:32.723513 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:32.723373 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:31:32.726176 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:31:32.726137 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6bc4de0_19fb_4f6a_a9f1_8d437dcf1f1b.slice/crio-87f498bb553ed96e8600c7695e4d53012add816840edac397474d397df880e57 WatchSource:0}: Error finding container 87f498bb553ed96e8600c7695e4d53012add816840edac397474d397df880e57: Status 404 returned error can't find the container with id 87f498bb553ed96e8600c7695e4d53012add816840edac397474d397df880e57
Apr 17 14:31:33.585055 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:33.585012 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq" event={"ID":"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b","Type":"ContainerStarted","Data":"87f498bb553ed96e8600c7695e4d53012add816840edac397474d397df880e57"}
Apr 17 14:31:36.595249 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:36.595215 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq" event={"ID":"b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b","Type":"ContainerStarted","Data":"09597de9087eee26fc4971f850d52947984d585b7335790a5f83d565ef494d96"}
Apr 17 14:31:36.595715 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:36.595268 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:31:36.615992 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:36.615931 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq" podStartSLOduration=1.633518571 podStartE2EDuration="4.615911768s" podCreationTimestamp="2026-04-17 14:31:32 +0000 UTC" firstStartedPulling="2026-04-17 14:31:32.728000436 +0000 UTC m=+587.762206313" lastFinishedPulling="2026-04-17 14:31:35.710393636 +0000 UTC m=+590.744599510" observedRunningTime="2026-04-17 14:31:36.615220286 +0000 UTC m=+591.649426182" watchObservedRunningTime="2026-04-17 14:31:36.615911768 +0000 UTC m=+591.650117664"
Apr 17 14:31:45.501898 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:45.501870 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log"
Apr 17 14:31:45.502431 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:45.502040 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log"
Apr 17 14:31:45.505271 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:45.505252 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:31:45.505380 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:45.505299 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:31:47.599074 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:31:47.599045 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-m6hpq"
Apr 17 14:32:07.626763 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:32:07.626733 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:32:45.403926 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:32:45.403885 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:32:50.400193 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:32:50.400134 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:32:53.302934 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:32:53.302896 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:33:01.496577 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:33:01.496531 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:33:08.800236 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:33:08.800197 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:33:20.100032 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:33:20.099996 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:33:29.993677 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:33:29.993597 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:34:28.802507 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:34:28.802473 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:34:33.698690 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:34:33.698659 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:34:39.992381 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:34:39.992341 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:34:50.195625 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:34:50.195589 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:34:58.493589 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:34:58.493505 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:35:08.902263 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:35:08.902228 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:35:18.296556 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:35:18.296517 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:35:28.498572 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:35:28.498537 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:36:29.895997 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:36:29.895909 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:36:45.523312 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:36:45.523278 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log"
Apr 17 14:36:45.525052 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:36:45.525026 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log"
Apr 17 14:36:45.526508 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:36:45.526484 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:36:45.528307 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:36:45.528282 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:36:45.608281 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:36:45.608246 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:37:23.495478 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:37:23.495437 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:37:41.092231 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:37:41.092195 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:37:55.189832 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:37:55.189730 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:38:10.691730 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:38:10.691694 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:39:04.990559 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:39:04.990527 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:39:14.892069 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:39:14.892030 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:39:30.292317 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:39:30.292236 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:39:39.894369 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:39:39.894335 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"]
Apr 17 14:39:56.293540 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:39:56.293507 2568
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:40:04.092957 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:40:04.092924 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:40:37.596474 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:40:37.596439 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:40:45.493957 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:40:45.493917 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:40:53.987825 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:40:53.987785 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:41:02.495468 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:41:02.495432 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:41:10.791824 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:41:10.791792 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:41:27.491721 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:41:27.491671 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:41:38.296064 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:41:38.296026 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:41:45.544835 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:41:45.544810 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:41:45.546765 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:41:45.546743 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:41:45.548710 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:41:45.548692 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:41:45.550831 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:41:45.550812 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:42:25.493668 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:42:25.493587 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:42:33.388882 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:42:33.388845 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:42:42.595819 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:42:42.595782 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:42:51.490875 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:42:51.490836 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:00.400531 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:00.400495 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:08.298426 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:08.298388 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:16.888381 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:16.888350 2568 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:21.995438 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:21.995391 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:25.691514 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:25.691481 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:35.093943 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:35.093905 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:43.093646 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:43.093606 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:43:52.792903 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:43:52.792872 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:44:01.298000 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:44:01.297912 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:44:10.394296 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:44:10.394256 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:44:17.792759 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:44:17.792716 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:44:27.595261 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:44:27.595229 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:44:34.892444 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:44:34.892410 2568 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:44:45.104155 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:44:45.104112 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:44:53.096109 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:44:53.096070 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:45:00.148050 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.148014 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607285-k47md"] Apr 17 14:45:00.151416 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.151393 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" Apr 17 14:45:00.154592 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.154569 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-lq9r8\"" Apr 17 14:45:00.170264 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.170232 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607285-k47md"] Apr 17 14:45:00.200279 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.200245 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pwbx\" (UniqueName: \"kubernetes.io/projected/22218945-b55e-462a-996f-bd7e70801982-kube-api-access-7pwbx\") pod \"maas-api-key-cleanup-29607285-k47md\" (UID: \"22218945-b55e-462a-996f-bd7e70801982\") " pod="opendatahub/maas-api-key-cleanup-29607285-k47md" Apr 17 14:45:00.300931 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.300895 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pwbx\" (UniqueName: 
\"kubernetes.io/projected/22218945-b55e-462a-996f-bd7e70801982-kube-api-access-7pwbx\") pod \"maas-api-key-cleanup-29607285-k47md\" (UID: \"22218945-b55e-462a-996f-bd7e70801982\") " pod="opendatahub/maas-api-key-cleanup-29607285-k47md" Apr 17 14:45:00.310458 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.310432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pwbx\" (UniqueName: \"kubernetes.io/projected/22218945-b55e-462a-996f-bd7e70801982-kube-api-access-7pwbx\") pod \"maas-api-key-cleanup-29607285-k47md\" (UID: \"22218945-b55e-462a-996f-bd7e70801982\") " pod="opendatahub/maas-api-key-cleanup-29607285-k47md" Apr 17 14:45:00.462040 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.461950 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" Apr 17 14:45:00.587700 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.587674 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607285-k47md"] Apr 17 14:45:00.590210 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:45:00.590183 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22218945_b55e_462a_996f_bd7e70801982.slice/crio-37e8d12f220a00355d559939b93245669157c1101977a504c444af6992858937 WatchSource:0}: Error finding container 37e8d12f220a00355d559939b93245669157c1101977a504c444af6992858937: Status 404 returned error can't find the container with id 37e8d12f220a00355d559939b93245669157c1101977a504c444af6992858937 Apr 17 14:45:00.592031 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:00.592018 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:45:01.187780 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:01.187747 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerStarted","Data":"37e8d12f220a00355d559939b93245669157c1101977a504c444af6992858937"} Apr 17 14:45:03.196438 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:03.196397 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerStarted","Data":"ec60a5f6cb94c34847bd2217992bbbc9c3998621c8dc055312b0313f643777df"} Apr 17 14:45:03.211807 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:03.211756 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" podStartSLOduration=1.088699653 podStartE2EDuration="3.211742095s" podCreationTimestamp="2026-04-17 14:45:00 +0000 UTC" firstStartedPulling="2026-04-17 14:45:00.592136838 +0000 UTC m=+1395.626342712" lastFinishedPulling="2026-04-17 14:45:02.715179268 +0000 UTC m=+1397.749385154" observedRunningTime="2026-04-17 14:45:03.210203395 +0000 UTC m=+1398.244409282" watchObservedRunningTime="2026-04-17 14:45:03.211742095 +0000 UTC m=+1398.245947991" Apr 17 14:45:24.267692 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:24.267655 2568 generic.go:358] "Generic (PLEG): container finished" podID="22218945-b55e-462a-996f-bd7e70801982" containerID="ec60a5f6cb94c34847bd2217992bbbc9c3998621c8dc055312b0313f643777df" exitCode=6 Apr 17 14:45:24.268106 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:24.267727 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerDied","Data":"ec60a5f6cb94c34847bd2217992bbbc9c3998621c8dc055312b0313f643777df"} Apr 17 14:45:24.268106 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:24.268058 2568 scope.go:117] "RemoveContainer" 
containerID="ec60a5f6cb94c34847bd2217992bbbc9c3998621c8dc055312b0313f643777df" Apr 17 14:45:25.272749 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:25.272715 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerStarted","Data":"ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655"} Apr 17 14:45:45.337596 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:45.337559 2568 generic.go:358] "Generic (PLEG): container finished" podID="22218945-b55e-462a-996f-bd7e70801982" containerID="ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655" exitCode=6 Apr 17 14:45:45.338021 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:45.337623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerDied","Data":"ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655"} Apr 17 14:45:45.338021 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:45.337654 2568 scope.go:117] "RemoveContainer" containerID="ec60a5f6cb94c34847bd2217992bbbc9c3998621c8dc055312b0313f643777df" Apr 17 14:45:45.338021 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:45.338009 2568 scope.go:117] "RemoveContainer" containerID="ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655" Apr 17 14:45:45.338290 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:45:45.338268 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607285-k47md_opendatahub(22218945-b55e-462a-996f-bd7e70801982)\"" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" podUID="22218945-b55e-462a-996f-bd7e70801982" Apr 17 14:45:56.580254 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:56.580210 2568 scope.go:117] 
"RemoveContainer" containerID="ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655" Apr 17 14:45:57.380196 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:57.380143 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerStarted","Data":"1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a"} Apr 17 14:45:57.639526 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:57.639453 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607285-k47md"] Apr 17 14:45:58.383887 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:45:58.383846 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" containerID="cri-o://1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a" gracePeriod=30 Apr 17 14:46:17.321526 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.321502 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" Apr 17 14:46:17.449625 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.449538 2568 generic.go:358] "Generic (PLEG): container finished" podID="22218945-b55e-462a-996f-bd7e70801982" containerID="1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a" exitCode=6 Apr 17 14:46:17.449625 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.449604 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" Apr 17 14:46:17.449820 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.449618 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerDied","Data":"1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a"} Apr 17 14:46:17.449820 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.449666 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607285-k47md" event={"ID":"22218945-b55e-462a-996f-bd7e70801982","Type":"ContainerDied","Data":"37e8d12f220a00355d559939b93245669157c1101977a504c444af6992858937"} Apr 17 14:46:17.449820 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.449681 2568 scope.go:117] "RemoveContainer" containerID="1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a" Apr 17 14:46:17.450224 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.450209 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pwbx\" (UniqueName: \"kubernetes.io/projected/22218945-b55e-462a-996f-bd7e70801982-kube-api-access-7pwbx\") pod \"22218945-b55e-462a-996f-bd7e70801982\" (UID: \"22218945-b55e-462a-996f-bd7e70801982\") " Apr 17 14:46:17.452407 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.452384 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22218945-b55e-462a-996f-bd7e70801982-kube-api-access-7pwbx" (OuterVolumeSpecName: "kube-api-access-7pwbx") pod "22218945-b55e-462a-996f-bd7e70801982" (UID: "22218945-b55e-462a-996f-bd7e70801982"). InnerVolumeSpecName "kube-api-access-7pwbx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:46:17.462566 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.462545 2568 scope.go:117] "RemoveContainer" containerID="ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655" Apr 17 14:46:17.469217 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.469201 2568 scope.go:117] "RemoveContainer" containerID="1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a" Apr 17 14:46:17.469441 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:46:17.469424 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a\": container with ID starting with 1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a not found: ID does not exist" containerID="1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a" Apr 17 14:46:17.469487 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.469448 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a"} err="failed to get container status \"1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a\": rpc error: code = NotFound desc = could not find container \"1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a\": container with ID starting with 1aedb6464b3b26021cd2589b377dc9bf4dc807b2817b763b9bfd535b3b1f9e8a not found: ID does not exist" Apr 17 14:46:17.469487 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.469466 2568 scope.go:117] "RemoveContainer" containerID="ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655" Apr 17 14:46:17.469657 ip-10-0-143-215 kubenswrapper[2568]: E0417 14:46:17.469639 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655\": container with ID starting with ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655 not found: ID does not exist" containerID="ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655" Apr 17 14:46:17.469700 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.469660 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655"} err="failed to get container status \"ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655\": rpc error: code = NotFound desc = could not find container \"ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655\": container with ID starting with ac8e899035a7305ff5e3cdf477a6cd842d82d55162b4c0a505ff292aefb75655 not found: ID does not exist" Apr 17 14:46:17.551297 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.551262 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7pwbx\" (UniqueName: \"kubernetes.io/projected/22218945-b55e-462a-996f-bd7e70801982-kube-api-access-7pwbx\") on node \"ip-10-0-143-215.ec2.internal\" DevicePath \"\"" Apr 17 14:46:17.777096 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.777059 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607285-k47md"] Apr 17 14:46:17.781664 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:17.781637 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607285-k47md"] Apr 17 14:46:19.584162 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:19.584123 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22218945-b55e-462a-996f-bd7e70801982" path="/var/lib/kubelet/pods/22218945-b55e-462a-996f-bd7e70801982/volumes" Apr 17 14:46:45.566753 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:45.566727 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:46:45.570043 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:45.570017 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:46:45.570946 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:45.570921 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:46:45.574188 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:46:45.574151 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:47:11.513870 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:47:11.513835 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:47:16.409037 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:47:16.409001 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:47:41.805637 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:47:41.805593 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:47:48.601224 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:47:48.601185 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:47:57.495503 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:47:57.495464 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:48:08.797035 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:48:08.796995 2568 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:48:16.905425 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:48:16.905389 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:48:27.499836 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:48:27.499758 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:48:36.797265 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:48:36.797226 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:48:47.794302 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:48:47.794266 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:48:55.623474 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:48:55.623439 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:49:06.096475 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:49:06.096435 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:49:15.107122 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:49:15.107085 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:49:48.093346 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:49:48.093308 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:50:30.897627 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:50:30.897548 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:50:40.191931 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:50:40.191892 2568 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:50:49.398132 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:50:49.398096 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:50:56.999180 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:50:56.999129 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:51:06.297088 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:06.297047 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:51:18.698221 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:18.698185 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:51:27.603198 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:27.603109 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:51:35.303288 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:35.303252 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:51:43.904078 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:43.904039 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:51:45.590453 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:45.590426 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:51:45.593793 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:45.593771 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 
14:51:45.593933 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:45.593914 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:51:45.597635 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:45.597616 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:51:51.797607 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:51:51.797571 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:52:00.302871 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:52:00.302835 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:52:11.304483 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:52:11.304448 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:52:28.135624 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:52:28.135591 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:52:37.309282 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:52:37.309246 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:52:45.814019 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:52:45.813985 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:52:53.514865 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:52:53.514829 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:53:11.414755 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:53:11.414664 2568 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:53:19.223463 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:53:19.223427 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:53:28.009692 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:53:28.009658 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:53:35.828978 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:53:35.828942 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:53:45.704716 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:53:45.704684 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:53:53.415473 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:53:53.415432 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:54:03.409342 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:54:03.409302 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:54:15.908304 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:54:15.908271 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:54:24.506454 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:54:24.506366 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:54:36.493970 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:54:36.493932 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:54:45.509951 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:54:45.509915 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:54:53.603059 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:54:53.603024 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:55:02.702910 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:55:02.702866 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:55:08.404848 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:55:08.404812 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:55:26.300323 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:55:26.299834 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:55:34.099407 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:55:34.099363 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:55:43.094039 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:55:43.094001 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:55:50.496161 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:55:50.496127 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:56:15.800368 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:15.800282 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:56:28.100661 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:28.100609 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:56:32.101232 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:32.101195 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m6hpq"] Apr 17 14:56:33.523332 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:33.523305 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-58c8f88b6d-tfhld_61119e72-7585-49d8-ab6c-37132891c232/manager/0.log" Apr 17 14:56:35.119976 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:35.119950 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-vwxf9_a1823878-6307-493c-ac83-fd78a1ba3c17/kuadrant-console-plugin/0.log" Apr 17 14:56:35.470635 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:35.470610 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-m6hpq_b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b/limitador/0.log" Apr 17 14:56:36.240117 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:36.240090 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5b474cc896-ndktp_586d834f-6ebb-4092-9eec-686a0d6fbccc/kube-auth-proxy/0.log" Apr 17 14:56:41.025490 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025452 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fr76n/must-gather-dbmg8"] Apr 17 14:56:41.025884 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025770 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.025884 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025782 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.025884 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025794 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.025884 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:56:41.025800 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.025884 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025860 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.025884 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025868 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.026077 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025940 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.026077 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025946 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.026077 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.025989 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="22218945-b55e-462a-996f-bd7e70801982" containerName="cleanup" Apr 17 14:56:41.029056 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.029033 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.032034 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.032009 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fr76n\"/\"kube-root-ca.crt\"" Apr 17 14:56:41.033094 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.033073 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fr76n\"/\"default-dockercfg-n2jw9\"" Apr 17 14:56:41.033238 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.033080 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fr76n\"/\"openshift-service-ca.crt\"" Apr 17 14:56:41.045594 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.045565 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/must-gather-dbmg8"] Apr 17 14:56:41.124821 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.124778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f6dc8fbd-c29f-43ec-98da-87f0a9824ec5-must-gather-output\") pod \"must-gather-dbmg8\" (UID: \"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5\") " pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.125005 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.124840 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2gqj\" (UniqueName: \"kubernetes.io/projected/f6dc8fbd-c29f-43ec-98da-87f0a9824ec5-kube-api-access-j2gqj\") pod \"must-gather-dbmg8\" (UID: \"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5\") " pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.226109 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.226066 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f6dc8fbd-c29f-43ec-98da-87f0a9824ec5-must-gather-output\") pod \"must-gather-dbmg8\" (UID: \"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5\") " pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.226286 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.226131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2gqj\" (UniqueName: \"kubernetes.io/projected/f6dc8fbd-c29f-43ec-98da-87f0a9824ec5-kube-api-access-j2gqj\") pod \"must-gather-dbmg8\" (UID: \"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5\") " pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.226452 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.226433 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f6dc8fbd-c29f-43ec-98da-87f0a9824ec5-must-gather-output\") pod \"must-gather-dbmg8\" (UID: \"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5\") " pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.236430 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.236403 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2gqj\" (UniqueName: \"kubernetes.io/projected/f6dc8fbd-c29f-43ec-98da-87f0a9824ec5-kube-api-access-j2gqj\") pod \"must-gather-dbmg8\" (UID: \"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5\") " pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.338742 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.338663 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fr76n/must-gather-dbmg8" Apr 17 14:56:41.461109 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.461078 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/must-gather-dbmg8"] Apr 17 14:56:41.463125 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:56:41.463089 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6dc8fbd_c29f_43ec_98da_87f0a9824ec5.slice/crio-038840f22abef50a4278fa942a0051e566c781721e855b019bf0227e50de47c4 WatchSource:0}: Error finding container 038840f22abef50a4278fa942a0051e566c781721e855b019bf0227e50de47c4: Status 404 returned error can't find the container with id 038840f22abef50a4278fa942a0051e566c781721e855b019bf0227e50de47c4 Apr 17 14:56:41.465369 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.465347 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:56:41.484895 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:41.484858 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/must-gather-dbmg8" event={"ID":"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5","Type":"ContainerStarted","Data":"038840f22abef50a4278fa942a0051e566c781721e855b019bf0227e50de47c4"} Apr 17 14:56:42.490570 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:42.490529 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/must-gather-dbmg8" event={"ID":"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5","Type":"ContainerStarted","Data":"e01627469437b899cb9fa7092205f1a07d1c958ccf1605d718c9216befab3256"} Apr 17 14:56:42.490915 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:42.490578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/must-gather-dbmg8" 
event={"ID":"f6dc8fbd-c29f-43ec-98da-87f0a9824ec5","Type":"ContainerStarted","Data":"d3d2ba4c06688d99074bd6b1330bbe12a5f94cbded3c57e68fe42756b0998bb3"} Apr 17 14:56:42.510562 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:42.510503 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fr76n/must-gather-dbmg8" podStartSLOduration=0.789800295 podStartE2EDuration="1.510484797s" podCreationTimestamp="2026-04-17 14:56:41 +0000 UTC" firstStartedPulling="2026-04-17 14:56:41.46554122 +0000 UTC m=+2096.499747100" lastFinishedPulling="2026-04-17 14:56:42.186225723 +0000 UTC m=+2097.220431602" observedRunningTime="2026-04-17 14:56:42.507889323 +0000 UTC m=+2097.542095220" watchObservedRunningTime="2026-04-17 14:56:42.510484797 +0000 UTC m=+2097.544690694" Apr 17 14:56:43.751237 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:43.751192 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8kqjl_bb5253c2-c2a0-4c5e-b333-f7ecd4bb0d29/global-pull-secret-syncer/0.log" Apr 17 14:56:43.994304 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:43.994271 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-r7mvj_48038e4e-5e89-4f13-aeb7-05e1197d4475/konnectivity-agent/0.log" Apr 17 14:56:44.086251 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:44.086184 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-215.ec2.internal_3514b8218264fee7cad79c255e536dea/haproxy/0.log" Apr 17 14:56:45.643187 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:45.643051 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:56:45.654187 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:45.651510 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:56:45.654187 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:45.653355 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:56:45.662187 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:45.659792 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log" Apr 17 14:56:48.539008 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:48.538977 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-vwxf9_a1823878-6307-493c-ac83-fd78a1ba3c17/kuadrant-console-plugin/0.log" Apr 17 14:56:48.675984 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:48.675957 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-m6hpq_b6bc4de0-19fb-4f6a-a9f1-8d437dcf1f1b/limitador/0.log" Apr 17 14:56:50.411703 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.411675 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rcpxg_45c30a2a-d629-471c-b0f0-30401c8f5083/kube-state-metrics/0.log" Apr 17 14:56:50.428244 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.428086 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rcpxg_45c30a2a-d629-471c-b0f0-30401c8f5083/kube-rbac-proxy-main/0.log" Apr 17 14:56:50.446216 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.446108 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rcpxg_45c30a2a-d629-471c-b0f0-30401c8f5083/kube-rbac-proxy-self/0.log" Apr 17 14:56:50.501409 
ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.501281 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-r7x4j_aebdc27c-3d37-4850-8498-6e2aa14e37c6/monitoring-plugin/0.log" Apr 17 14:56:50.530971 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.530947 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjgwp_c22e4e5b-cdfb-4f36-892c-be821cb5bb18/node-exporter/0.log" Apr 17 14:56:50.554289 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.554257 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjgwp_c22e4e5b-cdfb-4f36-892c-be821cb5bb18/kube-rbac-proxy/0.log" Apr 17 14:56:50.571098 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.571066 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pjgwp_c22e4e5b-cdfb-4f36-892c-be821cb5bb18/init-textfile/0.log" Apr 17 14:56:50.818199 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.818109 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b5dd786f-1d83-4b2c-a56a-6c01b76118e4/prometheus/0.log" Apr 17 14:56:50.854096 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.854066 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b5dd786f-1d83-4b2c-a56a-6c01b76118e4/config-reloader/0.log" Apr 17 14:56:50.874991 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.874962 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b5dd786f-1d83-4b2c-a56a-6c01b76118e4/thanos-sidecar/0.log" Apr 17 14:56:50.896615 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.896592 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b5dd786f-1d83-4b2c-a56a-6c01b76118e4/kube-rbac-proxy-web/0.log" Apr 17 14:56:50.916345 ip-10-0-143-215 
kubenswrapper[2568]: I0417 14:56:50.916308 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b5dd786f-1d83-4b2c-a56a-6c01b76118e4/kube-rbac-proxy/0.log" Apr 17 14:56:50.933836 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.933813 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b5dd786f-1d83-4b2c-a56a-6c01b76118e4/kube-rbac-proxy-thanos/0.log" Apr 17 14:56:50.952500 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.952474 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b5dd786f-1d83-4b2c-a56a-6c01b76118e4/init-config-reloader/0.log" Apr 17 14:56:50.975370 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.975342 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fn9g6_f1cfbda9-213d-401f-85eb-8006efac438b/prometheus-operator/0.log" Apr 17 14:56:50.990657 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:50.990603 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fn9g6_f1cfbda9-213d-401f-85eb-8006efac438b/kube-rbac-proxy/0.log" Apr 17 14:56:51.017498 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:51.017470 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-c7rdz_3dd4d766-4849-4e53-83b5-a6a7f75bed97/prometheus-operator-admission-webhook/0.log" Apr 17 14:56:52.573578 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.573520 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw"] Apr 17 14:56:52.580416 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.580378 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.587905 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.587878 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw"] Apr 17 14:56:52.631410 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.631375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-sys\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.631586 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.631431 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r25lk\" (UniqueName: \"kubernetes.io/projected/65640481-ea7a-447d-8d62-2a7a4a2eee4f-kube-api-access-r25lk\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.631586 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.631479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-podres\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.631586 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.631506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-lib-modules\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: 
\"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.631586 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.631573 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-proc\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732148 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-sys\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732364 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r25lk\" (UniqueName: \"kubernetes.io/projected/65640481-ea7a-447d-8d62-2a7a4a2eee4f-kube-api-access-r25lk\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732364 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732260 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-sys\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732364 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732266 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-podres\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732364 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732317 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-lib-modules\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732512 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732369 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-podres\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732512 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-proc\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732512 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-lib-modules\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.732512 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.732503 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/65640481-ea7a-447d-8d62-2a7a4a2eee4f-proc\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.741667 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.741635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r25lk\" (UniqueName: \"kubernetes.io/projected/65640481-ea7a-447d-8d62-2a7a4a2eee4f-kube-api-access-r25lk\") pod \"perf-node-gather-daemonset-s6kpw\" (UID: \"65640481-ea7a-447d-8d62-2a7a4a2eee4f\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:52.884608 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.884534 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/2.log" Apr 17 14:56:52.890219 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.890183 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ln5j2_93531d07-7bae-4782-818d-d6e8ceecf396/console-operator/3.log" Apr 17 14:56:52.893486 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:52.893459 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" Apr 17 14:56:53.054683 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:53.054648 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw"] Apr 17 14:56:53.057929 ip-10-0-143-215 kubenswrapper[2568]: W0417 14:56:53.057888 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod65640481_ea7a_447d_8d62_2a7a4a2eee4f.slice/crio-f5148a7701e02e054cf34b4c675651dcbebce47c38ddafc63c674e7b99cce182 WatchSource:0}: Error finding container f5148a7701e02e054cf34b4c675651dcbebce47c38ddafc63c674e7b99cce182: Status 404 returned error can't find the container with id f5148a7701e02e054cf34b4c675651dcbebce47c38ddafc63c674e7b99cce182 Apr 17 14:56:53.420873 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:53.420768 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-dwmhd_5d101190-d888-4039-937e-bdefcee0eb15/download-server/0.log" Apr 17 14:56:53.540457 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:53.540388 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" event={"ID":"65640481-ea7a-447d-8d62-2a7a4a2eee4f","Type":"ContainerStarted","Data":"40a0f2fc0392b1df80d46567949a6ae98d0d6d2479f288e869b78c18a0389355"} Apr 17 14:56:53.540457 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:53.540442 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" event={"ID":"65640481-ea7a-447d-8d62-2a7a4a2eee4f","Type":"ContainerStarted","Data":"f5148a7701e02e054cf34b4c675651dcbebce47c38ddafc63c674e7b99cce182"} Apr 17 14:56:53.540773 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:53.540540 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw"
Apr 17 14:56:53.560220 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:53.560138 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw" podStartSLOduration=1.560120003 podStartE2EDuration="1.560120003s" podCreationTimestamp="2026-04-17 14:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:56:53.557803449 +0000 UTC m=+2108.592009384" watchObservedRunningTime="2026-04-17 14:56:53.560120003 +0000 UTC m=+2108.594325899"
Apr 17 14:56:53.941245 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:53.941212 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-drrsx_bec6e69e-bcfa-4627-9496-dbf9608ffd71/volume-data-source-validator/0.log"
Apr 17 14:56:54.722634 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:54.722607 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gzm7s_a9c4445f-88cc-4c46-800e-db32500ad34d/dns/0.log"
Apr 17 14:56:54.740508 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:54.740481 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gzm7s_a9c4445f-88cc-4c46-800e-db32500ad34d/kube-rbac-proxy/0.log"
Apr 17 14:56:54.866717 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:54.866688 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6f2x6_7f633d5b-7896-43f3-b506-dc236c755507/dns-node-resolver/0.log"
Apr 17 14:56:55.404065 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:55.404033 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-r9nsg_d5d72b15-9ee0-40a2-b530-7847abb993f0/node-ca/0.log"
Apr 17 14:56:56.299088 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:56.299057 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5b474cc896-ndktp_586d834f-6ebb-4092-9eec-686a0d6fbccc/kube-auth-proxy/0.log"
Apr 17 14:56:56.838927 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:56.838901 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4nk2q_0c1ca4b0-b6b8-4bf7-8b62-7756a8d140e7/serve-healthcheck-canary/0.log"
Apr 17 14:56:57.390996 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:57.390969 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hkr86_1fe1f6e6-dc60-4bb7-9375-e72c5d01275d/kube-rbac-proxy/0.log"
Apr 17 14:56:57.407741 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:57.407715 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hkr86_1fe1f6e6-dc60-4bb7-9375-e72c5d01275d/exporter/0.log"
Apr 17 14:56:57.427296 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:57.427272 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hkr86_1fe1f6e6-dc60-4bb7-9375-e72c5d01275d/extractor/0.log"
Apr 17 14:56:59.491773 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:59.491745 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-58c8f88b6d-tfhld_61119e72-7585-49d8-ab6c-37132891c232/manager/0.log"
Apr 17 14:56:59.557391 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:56:59.557364 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-s6kpw"
Apr 17 14:57:00.537804 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:00.537774 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5b89f4cf56-f675g_b1bb5c1b-be4b-4680-9309-126980eafeac/manager/0.log"
Apr 17 14:57:05.027520 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:05.027485 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-nrsk2_8c673e08-3719-497d-8a89-99ae8c4bd1ee/migrator/0.log"
Apr 17 14:57:05.062505 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:05.062481 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-nrsk2_8c673e08-3719-497d-8a89-99ae8c4bd1ee/graceful-termination/0.log"
Apr 17 14:57:05.404951 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:05.404858 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9p4v4_b9528958-b786-4c25-8d67-30d1493f6002/kube-storage-version-migrator-operator/1.log"
Apr 17 14:57:05.405987 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:05.405952 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9p4v4_b9528958-b786-4c25-8d67-30d1493f6002/kube-storage-version-migrator-operator/0.log"
Apr 17 14:57:06.581629 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.581596 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zwtc_65b9b252-7788-4f34-9046-a58499e7e849/kube-multus-additional-cni-plugins/0.log"
Apr 17 14:57:06.600384 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.600356 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zwtc_65b9b252-7788-4f34-9046-a58499e7e849/egress-router-binary-copy/0.log"
Apr 17 14:57:06.620764 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.620736 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zwtc_65b9b252-7788-4f34-9046-a58499e7e849/cni-plugins/0.log"
Apr 17 14:57:06.638888 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.638857 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zwtc_65b9b252-7788-4f34-9046-a58499e7e849/bond-cni-plugin/0.log"
Apr 17 14:57:06.658070 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.658049 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zwtc_65b9b252-7788-4f34-9046-a58499e7e849/routeoverride-cni/0.log"
Apr 17 14:57:06.677039 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.677015 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zwtc_65b9b252-7788-4f34-9046-a58499e7e849/whereabouts-cni-bincopy/0.log"
Apr 17 14:57:06.712203 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.712162 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zwtc_65b9b252-7788-4f34-9046-a58499e7e849/whereabouts-cni/0.log"
Apr 17 14:57:06.944373 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:06.944298 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xgf47_ba2c74ab-e348-46bf-a8a9-3b804800268d/kube-multus/0.log"
Apr 17 14:57:07.048181 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.048138 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tg9jd_41c68694-ceb3-44f8-a9e8-e0655e8aa848/network-metrics-daemon/0.log"
Apr 17 14:57:07.065606 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.065586 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tg9jd_41c68694-ceb3-44f8-a9e8-e0655e8aa848/kube-rbac-proxy/0.log"
Apr 17 14:57:07.878680 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.878649 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-controller/0.log"
Apr 17 14:57:07.893698 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.893671 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/0.log"
Apr 17 14:57:07.903226 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.903207 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovn-acl-logging/1.log"
Apr 17 14:57:07.919446 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.919428 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/kube-rbac-proxy-node/0.log"
Apr 17 14:57:07.936115 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.936088 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 14:57:07.952319 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.952301 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/northd/0.log"
Apr 17 14:57:07.979790 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:07.979766 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/nbdb/0.log"
Apr 17 14:57:08.029500 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:08.029478 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/sbdb/0.log"
Apr 17 14:57:08.138729 ip-10-0-143-215 kubenswrapper[2568]: I0417 14:57:08.138625 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fm94_709e5989-ba48-455a-b8a9-25c4eafebaa4/ovnkube-controller/0.log"