Apr 22 19:23:18.041001 ip-10-0-129-175 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:18.469330 ip-10-0-129-175 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:18.469330 ip-10-0-129-175 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:18.469330 ip-10-0-129-175 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:18.469330 ip-10-0-129-175 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:18.469330 ip-10-0-129-175 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:18.471274 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.471175 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:18.475546 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475528 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:18.475546 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475547 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475551 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475555 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475559 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475563 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475566 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475569 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475572 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475575 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475578 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475581 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475584 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475587 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475590 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475593 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475595 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475598 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475600 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475605 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:18.475618 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475608 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475612 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475615 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475617 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475620 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475626 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475629 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475633 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475635 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475638 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475642 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475645 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475647 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475650 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475653 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475655 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475659 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475661 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475664 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475667 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:18.476077 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475669 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475672 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475675 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475678 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475680 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475682 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475685 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475688 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475690 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475692 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475695 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475698 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475700 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475703 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475707 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475709 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475712 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475715 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475718 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475721 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:18.476571 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475739 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475742 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475745 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475747 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475750 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475754 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475757 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475759 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475762 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475765 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475768 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475771 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475773 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475776 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475779 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475782 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475784 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475787 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475790 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475792 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:18.477068 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475795 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475797 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475800 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475802 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475805 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.475808 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476246 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476252 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476255 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476258 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476261 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476264 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476266 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476269 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476272 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476274 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476277 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476280 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476283 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476285 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:18.477582 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476288 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476291 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476294 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476296 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476299 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476302 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476304 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476310 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476313 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476316 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476318 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476321 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476323 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476326 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476328 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476331 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476333 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476336 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476339 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:18.478071 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476342 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476345 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476347 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476350 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476353 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476356 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476358 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476361 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476363 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476366 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476368 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476371 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476374 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476376 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476379 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476382 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476384 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476387 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476389 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476392 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:18.478537 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476394 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476397 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476400 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476403 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476405 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476408 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476410 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476413 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476415 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476418 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476422 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476425 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476430 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476433 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476436 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476438 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476441 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476443 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476447 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:18.479042 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476451 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476454 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476457 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476460 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476464 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476467 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476470 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476473 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476476 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476479 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476481 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476484 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476486 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.476489 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476572 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476584 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476593 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476600 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476609 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476614 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476621 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:18.479497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476628 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476633 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476637 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476642 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476645 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476649 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476652 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476654 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476658 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476661 2576 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476664 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476667 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476672 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476675 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476678 2576 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476681 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476684 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476688 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476691 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476695 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476698 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476702 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476705 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476708 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476711 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:18.480017 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476715 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476719 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476736 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476739 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476742 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476745 2576 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476748 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476753 2576 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476756 2576 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476760 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476764 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476767 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476771 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476774 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476777 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476781 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476784 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476787 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476790 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476793 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476796 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476799 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476802 2576 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476806 2576 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476810 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:18.480616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476813 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476816 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476819 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476823 2576 flags.go:64] FLAG: --help="false" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476826 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476829 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476832 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476836 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476839 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476843 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476847 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476850 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:18.481221 ip-10-0-129-175 
kubenswrapper[2576]: I0422 19:23:18.476853 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476856 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476859 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476862 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476866 2576 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476869 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476872 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476875 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476877 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476881 2576 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476884 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476886 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:18.481221 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476890 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476895 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476898 2576 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476901 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476904 2576 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476907 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476910 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476913 2576 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476916 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476921 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476924 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476928 2576 flags.go:64] FLAG: --max-pods="110" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476931 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476934 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476937 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476940 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476944 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:18.481909 
ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476947 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476950 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476958 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476962 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476965 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476968 2576 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:18.481909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476974 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476980 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476983 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476986 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476989 2576 flags.go:64] FLAG: --port="10250" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476993 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476996 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01d5392879a5b79f7" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.476999 2576 flags.go:64] FLAG: --qos-reserved="" Apr 
22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477002 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477005 2576 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477008 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477011 2576 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477015 2576 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477018 2576 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477021 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477023 2576 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477027 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477030 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477033 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477040 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477043 2576 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477046 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477049 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477052 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477055 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477057 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 19:23:18.482487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477061 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477064 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477067 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477070 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477074 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477077 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477081 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477085 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477089 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477092 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477097 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477100 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477103 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477109 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477112 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477114 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477118 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477121 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477124 2576 flags.go:64] FLAG: --v="2"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477128 2576 flags.go:64] FLAG: --version="false"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477133 2576 flags.go:64] FLAG: --vmodule=""
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477138 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477141 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477239 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477243 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:18.483147 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477247 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477250 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477253 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477256 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477259 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477262 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477265 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477268 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477271 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477274 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477276 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477280 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477283 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477287 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477290 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477293 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477295 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477298 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477301 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:18.483745 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477303 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477306 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477308 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477311 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477314 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477316 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477319 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477321 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477325 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477329 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477332 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477335 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477339 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477343 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477346 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477348 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477351 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477354 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477356 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:18.484233 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477359 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477362 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477364 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477367 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477370 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477373 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477376 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477381 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477385 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477388 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477390 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477393 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477395 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477398 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477401 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477403 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477406 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477409 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477411 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:18.484705 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477414 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477416 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477419 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477422 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477424 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477427 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477429 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477434 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477436 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477439 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477441 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477445 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477447 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477450 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477452 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477455 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477458 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477460 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477463 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477466 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:18.485208 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477470 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477473 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477475 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477478 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477481 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477483 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.477486 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.477491 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.484476 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.484493 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484540 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484547 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484553 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484556 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484559 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:18.485694 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484562 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484565 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484568 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484571 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484573 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484576 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484578 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484581 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484584 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484586 2576 feature_gate.go:328] unrecognized 
feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484591 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484594 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484597 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484600 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484603 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484606 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484608 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484611 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484614 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:18.486133 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484616 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484619 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484621 2576 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484624 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484626 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484629 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484631 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484636 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484639 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484642 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484644 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484648 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484651 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484653 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484656 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484658 2576 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484661 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484663 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484666 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484670 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:18.486596 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484672 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484674 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484677 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484679 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484682 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484685 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484687 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484690 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 
19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484692 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484695 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484697 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484700 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484702 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484705 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484707 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484711 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484713 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484716 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484719 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484735 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:18.487190 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484739 2576 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484742 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484744 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484747 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484750 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484753 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484755 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484758 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484760 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484763 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484765 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484768 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484771 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:18.487672 ip-10-0-129-175 
kubenswrapper[2576]: W0422 19:23:18.484774 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484776 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484779 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484781 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484784 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484786 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484789 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:18.487672 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484792 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484795 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.484800 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 
19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484905 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484911 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484914 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484918 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484921 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484923 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484926 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484929 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484932 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484935 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484939 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484941 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484944 2576 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:18.488187 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484946 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484949 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484952 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484954 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484957 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484959 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484962 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484964 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484967 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484970 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484972 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484976 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484981 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484984 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484987 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484989 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484992 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484995 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.484997 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:18.488570 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485000 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485002 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485005 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485007 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485010 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: 
W0422 19:23:18.485012 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485015 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485018 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485020 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485023 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485026 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485029 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485032 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485035 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485037 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485039 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485042 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485044 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:18.489030 ip-10-0-129-175 
kubenswrapper[2576]: W0422 19:23:18.485047 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485050 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:18.489030 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485053 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485055 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485058 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485060 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485063 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485065 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485068 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485070 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485073 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485075 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485079 2576 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485082 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485085 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485088 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485091 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485093 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485096 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485098 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485101 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:18.489510 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485104 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485107 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485110 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485113 2576 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485115 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485118 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485121 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485124 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485126 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485129 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485132 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485134 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485137 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485139 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:18.485142 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.485147 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:18.489982 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.485999 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:23:18.491149 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.491134 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:23:18.492295 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.492283 2576 server.go:1019] "Starting client certificate rotation" Apr 22 19:23:18.492416 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.492392 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:18.492475 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.492455 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:23:18.523802 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.523770 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:18.529973 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.529947 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:23:18.543694 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.543672 2576 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:23:18.549114 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.549093 2576 log.go:25] "Validated CRI v1 image API" Apr 22 19:23:18.550614 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.550592 
2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:23:18.555672 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.555645 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 e8400f85-063f-4688-833c-59e402cefab0:/dev/nvme0n1p4 f295b362-b2a1-4dd6-abd2-de36a035253b:/dev/nvme0n1p3] Apr 22 19:23:18.555755 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.555670 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:23:18.556100 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.556083 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:18.562631 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.562502 2576 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:18.559606091 +0000 UTC m=+0.420559819 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101988 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23b9891fa3a12e84667ec42e8b0d1f SystemUUID:ec23b989-1fa3-a12e-8466-7ec42e8b0d1f BootID:c9220006-50c8-4e23-8161-7f909e76c864 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs 
Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:14:57:d5:cc:e5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:14:57:d5:cc:e5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2e:f7:1d:06:45:f1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:23:18.562631 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.562622 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 19:23:18.562771 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.562720 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:23:18.564276 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.564247 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:23:18.564438 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.564279 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-175.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:23:18.564487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.564448 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:23:18.564487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.564457 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:23:18.564487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.564476 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:18.565319 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.565307 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:18.566953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.566941 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:18.567068 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.567059 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:23:18.571434 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.571420 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:23:18.571482 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.571437 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:23:18.571482 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.571454 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:23:18.571482 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.571465 2576 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:23:18.571482 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.571474 2576 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 19:23:18.573549 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.573534 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:18.573597 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.573555 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:18.581168 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.581144 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:23:18.583737 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.583696 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:23:18.585620 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585603 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:23:18.585620 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585624 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585630 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585636 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585642 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585648 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585654 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585660 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585667 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585673 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585682 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:23:18.585752 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.585691 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:23:18.586498 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.586487 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:23:18.586498 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.586498 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:23:18.588540 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.588515 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:23:18.588540 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.588522 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-175.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:23:18.590354 ip-10-0-129-175 kubenswrapper[2576]: 
I0422 19:23:18.590342 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:23:18.590391 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.590379 2576 server.go:1295] "Started kubelet" Apr 22 19:23:18.590495 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.590467 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:23:18.590570 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.590528 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:23:18.590617 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.590591 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:23:18.591257 ip-10-0-129-175 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:23:18.591489 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.591471 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:23:18.592951 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.592934 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:23:18.599147 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.599129 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:23:18.600074 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.600045 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:18.600374 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.600357 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-175.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:23:18.600512 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.600494 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:23:18.601130 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601110 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:23:18.601208 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601111 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:23:18.601208 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601161 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:23:18.601323 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.600270 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-175.ec2.internal.18a8c4382f6a144a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-175.ec2.internal,UID:ip-10-0-129-175.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-175.ec2.internal,},FirstTimestamp:2026-04-22 19:23:18.590354506 
+0000 UTC m=+0.451308236,LastTimestamp:2026-04-22 19:23:18.590354506 +0000 UTC m=+0.451308236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-175.ec2.internal,}" Apr 22 19:23:18.601323 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601289 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:23:18.601323 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601297 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:23:18.601476 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601343 2576 factory.go:55] Registering systemd factory Apr 22 19:23:18.601476 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.601364 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found" Apr 22 19:23:18.601476 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601394 2576 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:23:18.601630 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601618 2576 factory.go:153] Registering CRI-O factory Apr 22 19:23:18.601660 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601632 2576 factory.go:223] Registration of the crio container factory successfully Apr 22 19:23:18.601698 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601689 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:23:18.601756 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601710 2576 factory.go:103] Registering Raw factory Apr 22 19:23:18.601756 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.601721 2576 manager.go:1196] Started watching for new ooms in manager Apr 22 19:23:18.601947 ip-10-0-129-175 kubenswrapper[2576]: I0422 
19:23:18.601931 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zp5nq" Apr 22 19:23:18.602375 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.602364 2576 manager.go:319] Starting recovery of all containers Apr 22 19:23:18.608260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.608236 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zp5nq" Apr 22 19:23:18.611954 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.611777 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:23:18.612054 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.611949 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-175.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:23:18.613988 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.613871 2576 manager.go:324] Recovery completed Apr 22 19:23:18.615736 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.615706 2576 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 19:23:18.618678 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.618665 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:18.623166 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.623148 
2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:18.623246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.623179 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:18.623246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.623189 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:18.623799 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.623783 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:23:18.623799 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.623796 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:23:18.623879 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.623826 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:18.627318 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.627306 2576 policy_none.go:49] "None policy: Start" Apr 22 19:23:18.627368 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.627321 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:23:18.627368 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.627340 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:23:18.665376 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.665360 2576 manager.go:341] "Starting Device Plugin manager" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.665404 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.665416 2576 server.go:85] "Starting device plugin registration server" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.665705 2576 
eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.665717 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.665864 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.665988 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.665997 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.666469 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:23:18.691712 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.666508 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-175.ec2.internal\" not found" Apr 22 19:23:18.731411 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.731364 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:23:18.732608 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.732588 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:23:18.732744 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.732615 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:23:18.732744 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.732635 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:23:18.732744 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.732641 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:23:18.732744 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.732679 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:23:18.735271 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.735249 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:18.766497 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.766473 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:18.767371 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.767352 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:18.767478 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.767390 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:18.767478 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.767405 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:18.767478 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.767437 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.779928 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.779906 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.780032 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.779934 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-175.ec2.internal\": node \"ip-10-0-129-175.ec2.internal\" not found" Apr 22 
19:23:18.797706 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.797678 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found" Apr 22 19:23:18.833717 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.833685 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal"] Apr 22 19:23:18.833868 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.833779 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:18.834673 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.834657 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:18.834762 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.834692 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:18.834762 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.834706 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:18.837079 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837062 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:18.837220 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837205 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.837264 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837236 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:18.837760 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837741 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:18.837760 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837753 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:18.837883 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837773 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:18.837883 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837774 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:18.837883 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837783 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:18.837883 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.837788 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:18.839984 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.839970 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.840042 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.839995 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:18.840667 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.840648 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:18.840778 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.840678 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:18.840778 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.840688 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:18.862438 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.862415 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-175.ec2.internal\" not found" node="ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.866884 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.866864 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-175.ec2.internal\" not found" node="ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.897911 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.897884 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found" Apr 22 19:23:18.902887 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.902830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6f55add77c4d902c98f085f4badc816c-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal\" (UID: \"6f55add77c4d902c98f085f4badc816c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.902887 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.902869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f55add77c4d902c98f085f4badc816c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal\" (UID: \"6f55add77c4d902c98f085f4badc816c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.903042 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:18.902894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/65b71d693c1ea98bc6c9a5017263d773-config\") pod \"kube-apiserver-proxy-ip-10-0-129-175.ec2.internal\" (UID: \"65b71d693c1ea98bc6c9a5017263d773\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal" Apr 22 19:23:18.998324 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:18.998297 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found" Apr 22 19:23:19.003649 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.003630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6f55add77c4d902c98f085f4badc816c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal\" (UID: \"6f55add77c4d902c98f085f4badc816c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" Apr 22 19:23:19.003757 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.003674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6f55add77c4d902c98f085f4badc816c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal\" (UID: \"6f55add77c4d902c98f085f4badc816c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:19.003757 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.003701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/65b71d693c1ea98bc6c9a5017263d773-config\") pod \"kube-apiserver-proxy-ip-10-0-129-175.ec2.internal\" (UID: \"65b71d693c1ea98bc6c9a5017263d773\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:19.003757 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.003742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6f55add77c4d902c98f085f4badc816c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal\" (UID: \"6f55add77c4d902c98f085f4badc816c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:19.003877 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.003776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f55add77c4d902c98f085f4badc816c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal\" (UID: \"6f55add77c4d902c98f085f4badc816c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:19.003877 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.003776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/65b71d693c1ea98bc6c9a5017263d773-config\") pod \"kube-apiserver-proxy-ip-10-0-129-175.ec2.internal\" (UID: \"65b71d693c1ea98bc6c9a5017263d773\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:19.099137 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.099093 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.164652 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.164591 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:19.169628 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.169606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:19.199373 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.199340 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.299896 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.299852 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.400394 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.400355 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.492946 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.492856 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:23:19.493429 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.492996 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:23:19.501011 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.500991 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.600822 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.600792 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:19.601088 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.601069 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.612307 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.612282 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:18 +0000 UTC" deadline="2027-12-05 11:12:16.896447215 +0000 UTC"
Apr 22 19:23:19.612307 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.612306 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14199h48m57.284145012s"
Apr 22 19:23:19.615896 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.615878 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:19.638536 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.638508 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rkrnq"
Apr 22 19:23:19.646781 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.646760 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rkrnq"
Apr 22 19:23:19.701780 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.701741 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.729472 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:19.729428 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b71d693c1ea98bc6c9a5017263d773.slice/crio-3a10cbb3f6f14689c02a1b0a3a0de6ca6b4dc3d542ed7461a303d7f0a4054524 WatchSource:0}: Error finding container 3a10cbb3f6f14689c02a1b0a3a0de6ca6b4dc3d542ed7461a303d7f0a4054524: Status 404 returned error can't find the container with id 3a10cbb3f6f14689c02a1b0a3a0de6ca6b4dc3d542ed7461a303d7f0a4054524
Apr 22 19:23:19.730031 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:19.730006 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f55add77c4d902c98f085f4badc816c.slice/crio-4ee9a2fe5956f5e9775f279ee82f9f4ca53f4c2701de0c9d6f0ac3bd59b4b55d WatchSource:0}: Error finding container 4ee9a2fe5956f5e9775f279ee82f9f4ca53f4c2701de0c9d6f0ac3bd59b4b55d: Status 404 returned error can't find the container with id 4ee9a2fe5956f5e9775f279ee82f9f4ca53f4c2701de0c9d6f0ac3bd59b4b55d
Apr 22 19:23:19.733555 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.733535 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:23:19.735236 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.735200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" event={"ID":"6f55add77c4d902c98f085f4badc816c","Type":"ContainerStarted","Data":"4ee9a2fe5956f5e9775f279ee82f9f4ca53f4c2701de0c9d6f0ac3bd59b4b55d"}
Apr 22 19:23:19.736247 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.736225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal" event={"ID":"65b71d693c1ea98bc6c9a5017263d773","Type":"ContainerStarted","Data":"3a10cbb3f6f14689c02a1b0a3a0de6ca6b4dc3d542ed7461a303d7f0a4054524"}
Apr 22 19:23:19.801915 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.801820 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.902228 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:19.902201 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:19.987165 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:19.987139 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:20.002875 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.002852 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:20.094212 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.094131 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:20.103664 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.103634 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-175.ec2.internal\" not found"
Apr 22 19:23:20.144149 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.144110 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:20.201691 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.201648 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:20.218954 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.218892 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:20.223923 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.223625 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal"
Apr 22 19:23:20.229592 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.229556 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:23:20.573281 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.573195 2576 apiserver.go:52] "Watching apiserver"
Apr 22 19:23:20.578520 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.578496 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:23:20.578907 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.578881 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nflqk","openshift-network-diagnostics/network-check-target-858rm","kube-system/konnectivity-agent-w4jz7","openshift-cluster-node-tuning-operator/tuned-tmk7w","openshift-image-registry/node-ca-pnb2n","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal","openshift-multus/multus-additional-cni-plugins-hr5vn","openshift-multus/network-metrics-daemon-4sl88","openshift-network-operator/iptables-alerter-68jrq","openshift-ovn-kubernetes/ovnkube-node-r2gjj","kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p","openshift-dns/node-resolver-5hk26"]
Apr 22 19:23:20.581720 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.581692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pnb2n"
Apr 22 19:23:20.583912 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.583891 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:23:20.584005 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.583891 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:23:20.584069 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.584007 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d7znh\""
Apr 22 19:23:20.584069 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.583891 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:23:20.586847 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.586809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:20.586926 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.586878 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:20.586926 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.586903 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5hk26"
Apr 22 19:23:20.589363 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.589346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.590047 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.589702 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:23:20.590047 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.589738 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qgw28\""
Apr 22 19:23:20.590047 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.589931 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:23:20.591345 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.591328 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:23:20.591610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.591558 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hr5vn"
Apr 22 19:23:20.591694 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.591632 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:23:20.593086 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.593069 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:23:20.593301 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.593288 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:23:20.593380 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.593315 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-26kft\""
Apr 22 19:23:20.593745 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.593715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.594132 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.594111 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:23:20.594600 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.594581 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hkfdv\""
Apr 22 19:23:20.594679 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.594637 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:23:20.595258 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.595238 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:20.595365 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.595353 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b5jlt\""
Apr 22 19:23:20.596073 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.595693 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:20.598610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.598203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:20.598610 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.598275 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:20.598610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.598298 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-68jrq"
Apr 22 19:23:20.600148 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.600120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:23:20.600233 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.600157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:20.600233 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.600221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:20.600570 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.600535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gsd26\""
Apr 22 19:23:20.600703 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.600686 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.604260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.602639 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:23:20.604260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.602756 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:23:20.604260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.602909 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:23:20.604260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.603817 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:23:20.604260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.603918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:23:20.604260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.604061 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gfcjx\""
Apr 22 19:23:20.606321 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.605550 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:23:20.609123 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.608627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w4jz7"
Apr 22 19:23:20.609123 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.608668 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p"
Apr 22 19:23:20.610670 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.610654 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:23:20.610830 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.610813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:23:20.610907 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.610858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:23:20.610964 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.610920 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xgnxp\""
Apr 22 19:23:20.611028 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.610815 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rx9ps\""
Apr 22 19:23:20.611082 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.611074 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:23:20.611198 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.611181 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:23:20.612905 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.612887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-tmp\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.613012 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.612918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-systemd-units\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613012 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.612943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-var-lib-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613012 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.612968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613160 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b532865-112c-43d5-a22b-58a5cece9682-multus-daemon-config\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613160 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-etc-kubernetes\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613160 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-modprobe-d\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.613286 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613286 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-log-socket\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613286 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6c9d2810-4d20-4eb6-9318-15765f879dfa-iptables-alerter-script\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq"
Apr 22 19:23:20.613286 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2085208-6038-4f2b-929e-b75fe56d5a28-host\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n"
Apr 22 19:23:20.613286 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-os-release\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613286 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2g5\" (UniqueName: \"kubernetes.io/projected/4b532865-112c-43d5-a22b-58a5cece9682-kube-api-access-wz2g5\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-lib-modules\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovnkube-config\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8z2\" (UniqueName: \"kubernetes.io/projected/7ec73ca3-22df-4ef0-ad03-92031016c8b8-kube-api-access-7n8z2\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c9d2810-4d20-4eb6-9318-15765f879dfa-host-slash\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-netns\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a2085208-6038-4f2b-929e-b75fe56d5a28-serviceca\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-tuned\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.613533 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkm5k\" (UniqueName: \"kubernetes.io/projected/17ada943-fc5e-4b09-bf82-9132909cb32d-kube-api-access-dkm5k\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w62d\" (UniqueName: \"kubernetes.io/projected/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-kube-api-access-5w62d\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-cni-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-multus-certs\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzrj\" (UniqueName: \"kubernetes.io/projected/ac86a56e-148b-4fb4-8415-acff438d7915-kube-api-access-qxzrj\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-node-log\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-cni-netd\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-system-cni-dir\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-system-cni-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysconfig\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysctl-d\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-sys\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.613953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-var-lib-kubelet\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.613976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-run-netns\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-ovn\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv5th\" (UniqueName: \"kubernetes.io/projected/a2085208-6038-4f2b-929e-b75fe56d5a28-kube-api-access-hv5th\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-kubelet\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-registration-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-etc-selinux\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p"
Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-sys-fs\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p"
Apr 22 19:23:20.614610
ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-cnibin\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-cni-multus\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysctl-conf\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-host\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-etc-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-device-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-systemd\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.614610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-k8s-cni-cncf-io\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-hostroot\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-run\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovnkube-script-lib\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-os-release\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b532865-112c-43d5-a22b-58a5cece9682-cni-binary-copy\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614535 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r5vj\" (UniqueName: \"kubernetes.io/projected/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-kube-api-access-7r5vj\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-env-overrides\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovn-node-metrics-cert\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-cnibin\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") 
" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb4cf245-3080-4779-80ac-295e37a8327f-hosts-file\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-kubelet\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-slash\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.615261 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkv8d\" (UniqueName: 
\"kubernetes.io/projected/6c9d2810-4d20-4eb6-9318-15765f879dfa-kube-api-access-hkv8d\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-cni-bin\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb4cf245-3080-4779-80ac-295e37a8327f-tmp-dir\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-conf-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-systemd\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614862 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-socket-dir-parent\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-cni-bin\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-kubernetes\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.614996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: 
\"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.615025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lmlp\" (UniqueName: \"kubernetes.io/projected/eb4cf245-3080-4779-80ac-295e37a8327f-kube-api-access-2lmlp\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.615961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.615047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-socket-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.647411 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.647355 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:19 +0000 UTC" deadline="2027-12-03 06:19:47.402474681 +0000 UTC" Apr 22 19:23:20.647411 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.647391 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14146h56m26.755088394s" Apr 22 19:23:20.701887 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.701854 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:23:20.715255 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: 
\"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.715413 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w62d\" (UniqueName: \"kubernetes.io/projected/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-kube-api-access-5w62d\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.715413 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2fb0effb-90c6-4bf4-8981-baf430cec62a-agent-certs\") pod \"konnectivity-agent-w4jz7\" (UID: \"2fb0effb-90c6-4bf4-8981-baf430cec62a\") " pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:20.715413 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-cni-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.715413 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-multus-certs\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.715413 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzrj\" (UniqueName: 
\"kubernetes.io/projected/ac86a56e-148b-4fb4-8415-acff438d7915-kube-api-access-qxzrj\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:20.715413 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-node-log\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.715413 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-cni-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-multus-certs\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-node-log\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-cni-netd\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-system-cni-dir\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-cni-netd\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715606 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-system-cni-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-system-cni-dir\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysconfig\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysctl-d\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-sys\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-system-cni-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysconfig\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.715776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-var-lib-kubelet\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-sys\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-run-netns\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-ovn\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysctl-d\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-var-lib-kubelet\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-ovn\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-run-netns\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv5th\" (UniqueName: \"kubernetes.io/projected/a2085208-6038-4f2b-929e-b75fe56d5a28-kube-api-access-hv5th\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.715909 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-kubelet\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.715950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-registration-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.715996 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs podName:ac86a56e-148b-4fb4-8415-acff438d7915 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:21.215962663 +0000 UTC m=+3.076916381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs") pod "network-metrics-daemon-4sl88" (UID: "ac86a56e-148b-4fb4-8415-acff438d7915") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-registration-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-etc-selinux\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-kubelet\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.716451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-sys-fs\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-cnibin\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-cni-multus\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysctl-conf\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-sys-fs\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-etc-selinux\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-host\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-cnibin\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-cni-multus\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-etc-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-etc-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-host\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-device-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-systemd\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: 
\"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-device-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.717246 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-k8s-cni-cncf-io\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-sysctl-conf\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-systemd\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-hostroot\") pod \"multus-nflqk\" (UID: 
\"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-run\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovnkube-script-lib\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-k8s-cni-cncf-io\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-hostroot\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " 
pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-os-release\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-run\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b532865-112c-43d5-a22b-58a5cece9682-cni-binary-copy\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-os-release\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r5vj\" (UniqueName: \"kubernetes.io/projected/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-kube-api-access-7r5vj\") pod \"tuned-tmk7w\" (UID: 
\"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-env-overrides\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovn-node-metrics-cert\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-cnibin\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716623 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:20.718036 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/eb4cf245-3080-4779-80ac-295e37a8327f-hosts-file\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17ada943-fc5e-4b09-bf82-9132909cb32d-cnibin\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-kubelet\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-slash\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hkv8d\" (UniqueName: \"kubernetes.io/projected/6c9d2810-4d20-4eb6-9318-15765f879dfa-kube-api-access-hkv8d\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-cni-bin\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb4cf245-3080-4779-80ac-295e37a8327f-tmp-dir\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-conf-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-systemd\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-socket-dir-parent\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb4cf245-3080-4779-80ac-295e37a8327f-hosts-file\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-cni-bin\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-kubernetes\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-kubelet\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-slash\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.718872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovnkube-script-lib\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-env-overrides\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.716917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717087 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b532865-112c-43d5-a22b-58a5cece9682-cni-binary-copy\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2fb0effb-90c6-4bf4-8981-baf430cec62a-konnectivity-ca\") pod \"konnectivity-agent-w4jz7\" (UID: \"2fb0effb-90c6-4bf4-8981-baf430cec62a\") " pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-cni-bin\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lmlp\" (UniqueName: \"kubernetes.io/projected/eb4cf245-3080-4779-80ac-295e37a8327f-kube-api-access-2lmlp\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.719578 
ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-socket-dir-parent\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-systemd\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-socket-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-var-lib-cni-bin\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-tmp\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.719578 ip-10-0-129-175 
kubenswrapper[2576]: I0422 19:23:20.717220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-kubernetes\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-systemd-units\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-var-lib-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.719578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " 
pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b532865-112c-43d5-a22b-58a5cece9682-multus-daemon-config\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717299 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-systemd-units\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-var-lib-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.720305 ip-10-0-129-175 
kubenswrapper[2576]: I0422 19:23:20.717389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-etc-kubernetes\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-socket-dir\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-modprobe-d\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-etc-kubernetes\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.720305 ip-10-0-129-175 
kubenswrapper[2576]: I0422 19:23:20.717473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-log-socket\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-multus-conf-dir\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-run-openvswitch\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb4cf245-3080-4779-80ac-295e37a8327f-tmp-dir\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6c9d2810-4d20-4eb6-9318-15765f879dfa-iptables-alerter-script\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 
19:23:20.717522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ec73ca3-22df-4ef0-ad03-92031016c8b8-log-socket\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-modprobe-d\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.720305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.717749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17ada943-fc5e-4b09-bf82-9132909cb32d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2085208-6038-4f2b-929e-b75fe56d5a28-host\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-os-release\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718162 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2085208-6038-4f2b-929e-b75fe56d5a28-host\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2g5\" (UniqueName: \"kubernetes.io/projected/4b532865-112c-43d5-a22b-58a5cece9682-kube-api-access-wz2g5\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-lib-modules\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-os-release\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovnkube-config\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7n8z2\" (UniqueName: \"kubernetes.io/projected/7ec73ca3-22df-4ef0-ad03-92031016c8b8-kube-api-access-7n8z2\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c9d2810-4d20-4eb6-9318-15765f879dfa-host-slash\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-netns\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-lib-modules\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b532865-112c-43d5-a22b-58a5cece9682-multus-daemon-config\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/a2085208-6038-4f2b-929e-b75fe56d5a28-serviceca\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6c9d2810-4d20-4eb6-9318-15765f879dfa-host-slash\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-tuned\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6c9d2810-4d20-4eb6-9318-15765f879dfa-iptables-alerter-script\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkm5k\" (UniqueName: \"kubernetes.io/projected/17ada943-fc5e-4b09-bf82-9132909cb32d-kube-api-access-dkm5k\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.723435 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718604 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b532865-112c-43d5-a22b-58a5cece9682-host-run-netns\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.724093 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a2085208-6038-4f2b-929e-b75fe56d5a28-serviceca\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n" Apr 22 19:23:20.724093 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.718898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovnkube-config\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.724093 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.721989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-tmp\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.724093 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.722110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ec73ca3-22df-4ef0-ad03-92031016c8b8-ovn-node-metrics-cert\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.724093 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.723830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-etc-tuned\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.726917 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.726862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzrj\" (UniqueName: \"kubernetes.io/projected/ac86a56e-148b-4fb4-8415-acff438d7915-kube-api-access-qxzrj\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:20.727789 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.727757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w62d\" (UniqueName: \"kubernetes.io/projected/f634baad-b7a2-4fe7-9706-4ecc6e5b9366-kube-api-access-5w62d\") pod \"aws-ebs-csi-driver-node-s2l7p\" (UID: \"f634baad-b7a2-4fe7-9706-4ecc6e5b9366\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:20.727990 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.727967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2g5\" (UniqueName: \"kubernetes.io/projected/4b532865-112c-43d5-a22b-58a5cece9682-kube-api-access-wz2g5\") pod \"multus-nflqk\" (UID: \"4b532865-112c-43d5-a22b-58a5cece9682\") " pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.730431 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.730409 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:20.730549 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.730438 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:20.730549 
ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.730452 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jw8rh for pod openshift-network-diagnostics/network-check-target-858rm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:20.730549 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:20.730526 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh podName:c887aba7-380b-4a80-bc2c-6e89b986da6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:21.230506065 +0000 UTC m=+3.091459786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jw8rh" (UniqueName: "kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh") pod "network-check-target-858rm" (UID: "c887aba7-380b-4a80-bc2c-6e89b986da6b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:20.731482 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.731458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r5vj\" (UniqueName: \"kubernetes.io/projected/cf7e7909-ac67-46c2-af45-9ceb49eb60c2-kube-api-access-7r5vj\") pod \"tuned-tmk7w\" (UID: \"cf7e7909-ac67-46c2-af45-9ceb49eb60c2\") " pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.731705 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.731675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkv8d\" (UniqueName: \"kubernetes.io/projected/6c9d2810-4d20-4eb6-9318-15765f879dfa-kube-api-access-hkv8d\") pod \"iptables-alerter-68jrq\" (UID: \"6c9d2810-4d20-4eb6-9318-15765f879dfa\") " 
pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.732113 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.732092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lmlp\" (UniqueName: \"kubernetes.io/projected/eb4cf245-3080-4779-80ac-295e37a8327f-kube-api-access-2lmlp\") pod \"node-resolver-5hk26\" (UID: \"eb4cf245-3080-4779-80ac-295e37a8327f\") " pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.732349 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.732303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv5th\" (UniqueName: \"kubernetes.io/projected/a2085208-6038-4f2b-929e-b75fe56d5a28-kube-api-access-hv5th\") pod \"node-ca-pnb2n\" (UID: \"a2085208-6038-4f2b-929e-b75fe56d5a28\") " pod="openshift-image-registry/node-ca-pnb2n" Apr 22 19:23:20.733406 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.733379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8z2\" (UniqueName: \"kubernetes.io/projected/7ec73ca3-22df-4ef0-ad03-92031016c8b8-kube-api-access-7n8z2\") pod \"ovnkube-node-r2gjj\" (UID: \"7ec73ca3-22df-4ef0-ad03-92031016c8b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.733540 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.733521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkm5k\" (UniqueName: \"kubernetes.io/projected/17ada943-fc5e-4b09-bf82-9132909cb32d-kube-api-access-dkm5k\") pod \"multus-additional-cni-plugins-hr5vn\" (UID: \"17ada943-fc5e-4b09-bf82-9132909cb32d\") " pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.819260 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.819230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2fb0effb-90c6-4bf4-8981-baf430cec62a-agent-certs\") pod \"konnectivity-agent-w4jz7\" 
(UID: \"2fb0effb-90c6-4bf4-8981-baf430cec62a\") " pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:20.819395 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.819379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2fb0effb-90c6-4bf4-8981-baf430cec62a-konnectivity-ca\") pod \"konnectivity-agent-w4jz7\" (UID: \"2fb0effb-90c6-4bf4-8981-baf430cec62a\") " pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:20.819810 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.819795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2fb0effb-90c6-4bf4-8981-baf430cec62a-konnectivity-ca\") pod \"konnectivity-agent-w4jz7\" (UID: \"2fb0effb-90c6-4bf4-8981-baf430cec62a\") " pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:20.821443 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.821410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2fb0effb-90c6-4bf4-8981-baf430cec62a-agent-certs\") pod \"konnectivity-agent-w4jz7\" (UID: \"2fb0effb-90c6-4bf4-8981-baf430cec62a\") " pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:20.892399 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.892309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pnb2n" Apr 22 19:23:20.900144 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.900109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5hk26" Apr 22 19:23:20.911024 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.910992 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nflqk" Apr 22 19:23:20.916081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.916045 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" Apr 22 19:23:20.922840 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.922817 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" Apr 22 19:23:20.931473 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.931445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-68jrq" Apr 22 19:23:20.937210 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.937173 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:20.944851 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.944829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:20.950482 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:20.950461 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" Apr 22 19:23:21.012272 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.012236 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:21.222093 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.222008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:21.222235 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:21.222160 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:21.222302 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:21.222236 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs podName:ac86a56e-148b-4fb4-8415-acff438d7915 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:22.222215096 +0000 UTC m=+4.083168815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs") pod "network-metrics-daemon-4sl88" (UID: "ac86a56e-148b-4fb4-8415-acff438d7915") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:21.322519 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.322484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:21.322710 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:21.322635 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:21.322710 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:21.322655 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:21.322710 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:21.322666 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jw8rh for pod openshift-network-diagnostics/network-check-target-858rm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:21.322907 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:21.322741 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh podName:c887aba7-380b-4a80-bc2c-6e89b986da6b nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:22.322706291 +0000 UTC m=+4.183660020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jw8rh" (UniqueName: "kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh") pod "network-check-target-858rm" (UID: "c887aba7-380b-4a80-bc2c-6e89b986da6b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:21.401165 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.401136 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2085208_6038_4f2b_929e_b75fe56d5a28.slice/crio-bfe4819364775c71e8bd54a8a3e6969127583d928bec75c559888a6694248457 WatchSource:0}: Error finding container bfe4819364775c71e8bd54a8a3e6969127583d928bec75c559888a6694248457: Status 404 returned error can't find the container with id bfe4819364775c71e8bd54a8a3e6969127583d928bec75c559888a6694248457 Apr 22 19:23:21.404154 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.404046 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b532865_112c_43d5_a22b_58a5cece9682.slice/crio-d2f73bdc9bc2758d1e7f3ea45ca17d6691c8ce83423ed857855982803473afcd WatchSource:0}: Error finding container d2f73bdc9bc2758d1e7f3ea45ca17d6691c8ce83423ed857855982803473afcd: Status 404 returned error can't find the container with id d2f73bdc9bc2758d1e7f3ea45ca17d6691c8ce83423ed857855982803473afcd Apr 22 19:23:21.405502 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.405476 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb0effb_90c6_4bf4_8981_baf430cec62a.slice/crio-6c629785cb06ddd5b8942e2fa8bf651fad3b628d2ad323e854780218c42519c5 WatchSource:0}: Error finding container 
6c629785cb06ddd5b8942e2fa8bf651fad3b628d2ad323e854780218c42519c5: Status 404 returned error can't find the container with id 6c629785cb06ddd5b8942e2fa8bf651fad3b628d2ad323e854780218c42519c5 Apr 22 19:23:21.406382 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.406359 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17ada943_fc5e_4b09_bf82_9132909cb32d.slice/crio-7b5d429c78c69d35141ec3f17ba6711336455d902ff5a4cd71c5b365e20e2fd6 WatchSource:0}: Error finding container 7b5d429c78c69d35141ec3f17ba6711336455d902ff5a4cd71c5b365e20e2fd6: Status 404 returned error can't find the container with id 7b5d429c78c69d35141ec3f17ba6711336455d902ff5a4cd71c5b365e20e2fd6 Apr 22 19:23:21.406991 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.406961 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec73ca3_22df_4ef0_ad03_92031016c8b8.slice/crio-fbe3f10fd0429b1ddda02ee08787f2a618c4081d41711851c96d4da59dbf5433 WatchSource:0}: Error finding container fbe3f10fd0429b1ddda02ee08787f2a618c4081d41711851c96d4da59dbf5433: Status 404 returned error can't find the container with id fbe3f10fd0429b1ddda02ee08787f2a618c4081d41711851c96d4da59dbf5433 Apr 22 19:23:21.408421 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.408116 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4cf245_3080_4779_80ac_295e37a8327f.slice/crio-0834e570357a3d4b8ad014c6970f3e4efa0b50949f9ca723d7594ef08a069073 WatchSource:0}: Error finding container 0834e570357a3d4b8ad014c6970f3e4efa0b50949f9ca723d7594ef08a069073: Status 404 returned error can't find the container with id 0834e570357a3d4b8ad014c6970f3e4efa0b50949f9ca723d7594ef08a069073 Apr 22 19:23:21.409413 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.408917 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c9d2810_4d20_4eb6_9318_15765f879dfa.slice/crio-c7d5f389fd550770df17d0443aabe553f2b1f6c9a66bc9eabe4c2ba4231425ae WatchSource:0}: Error finding container c7d5f389fd550770df17d0443aabe553f2b1f6c9a66bc9eabe4c2ba4231425ae: Status 404 returned error can't find the container with id c7d5f389fd550770df17d0443aabe553f2b1f6c9a66bc9eabe4c2ba4231425ae
Apr 22 19:23:21.411449 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.411078 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf634baad_b7a2_4fe7_9706_4ecc6e5b9366.slice/crio-37c6c506db0b53870fac374f9aff0a4f178b8099216e7345ce0fb1239cb8e51e WatchSource:0}: Error finding container 37c6c506db0b53870fac374f9aff0a4f178b8099216e7345ce0fb1239cb8e51e: Status 404 returned error can't find the container with id 37c6c506db0b53870fac374f9aff0a4f178b8099216e7345ce0fb1239cb8e51e
Apr 22 19:23:21.413458 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:21.413416 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf7e7909_ac67_46c2_af45_9ceb49eb60c2.slice/crio-23bea4c43ddf44f819d61bbc5878f3f70fe809d7a4f0801767ae6160dd6c7d81 WatchSource:0}: Error finding container 23bea4c43ddf44f819d61bbc5878f3f70fe809d7a4f0801767ae6160dd6c7d81: Status 404 returned error can't find the container with id 23bea4c43ddf44f819d61bbc5878f3f70fe809d7a4f0801767ae6160dd6c7d81
Apr 22 19:23:21.648330 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.648134 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:19 +0000 UTC" deadline="2027-10-26 17:55:18.777435869 +0000 UTC"
Apr 22 19:23:21.648330 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.648322 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13246h31m57.129117493s"
Apr 22 19:23:21.733461 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.733378 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:21.734008 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:21.733870 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:21.740236 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.740101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnb2n" event={"ID":"a2085208-6038-4f2b-929e-b75fe56d5a28","Type":"ContainerStarted","Data":"bfe4819364775c71e8bd54a8a3e6969127583d928bec75c559888a6694248457"}
Apr 22 19:23:21.742216 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.742136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal" event={"ID":"65b71d693c1ea98bc6c9a5017263d773","Type":"ContainerStarted","Data":"ae00d79af3896b892c40bc60dadf5e0b09603723f82be50e3f48c7e0aaf8744f"}
Apr 22 19:23:21.746193 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.746167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" event={"ID":"f634baad-b7a2-4fe7-9706-4ecc6e5b9366","Type":"ContainerStarted","Data":"37c6c506db0b53870fac374f9aff0a4f178b8099216e7345ce0fb1239cb8e51e"}
Apr 22 19:23:21.749487 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.749459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-68jrq" event={"ID":"6c9d2810-4d20-4eb6-9318-15765f879dfa","Type":"ContainerStarted","Data":"c7d5f389fd550770df17d0443aabe553f2b1f6c9a66bc9eabe4c2ba4231425ae"}
Apr 22 19:23:21.750636 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.750587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w4jz7" event={"ID":"2fb0effb-90c6-4bf4-8981-baf430cec62a","Type":"ContainerStarted","Data":"6c629785cb06ddd5b8942e2fa8bf651fad3b628d2ad323e854780218c42519c5"}
Apr 22 19:23:21.754288 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.754259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nflqk" event={"ID":"4b532865-112c-43d5-a22b-58a5cece9682","Type":"ContainerStarted","Data":"d2f73bdc9bc2758d1e7f3ea45ca17d6691c8ce83423ed857855982803473afcd"}
Apr 22 19:23:21.756400 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.755945 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-175.ec2.internal" podStartSLOduration=1.7559292960000001 podStartE2EDuration="1.755929296s" podCreationTimestamp="2026-04-22 19:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:21.755214163 +0000 UTC m=+3.616167902" watchObservedRunningTime="2026-04-22 19:23:21.755929296 +0000 UTC m=+3.616883035"
Apr 22 19:23:21.757370 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.757326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" event={"ID":"cf7e7909-ac67-46c2-af45-9ceb49eb60c2","Type":"ContainerStarted","Data":"23bea4c43ddf44f819d61bbc5878f3f70fe809d7a4f0801767ae6160dd6c7d81"}
Apr 22 19:23:21.760450 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.760358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5hk26" event={"ID":"eb4cf245-3080-4779-80ac-295e37a8327f","Type":"ContainerStarted","Data":"0834e570357a3d4b8ad014c6970f3e4efa0b50949f9ca723d7594ef08a069073"}
Apr 22 19:23:21.762842 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.762799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"fbe3f10fd0429b1ddda02ee08787f2a618c4081d41711851c96d4da59dbf5433"}
Apr 22 19:23:21.765995 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:21.765937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerStarted","Data":"7b5d429c78c69d35141ec3f17ba6711336455d902ff5a4cd71c5b365e20e2fd6"}
Apr 22 19:23:22.229597 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:22.228980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:22.229597 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:22.229141 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:22.229597 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:22.229206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs podName:ac86a56e-148b-4fb4-8415-acff438d7915 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:24.229185441 +0000 UTC m=+6.090139159 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs") pod "network-metrics-daemon-4sl88" (UID: "ac86a56e-148b-4fb4-8415-acff438d7915") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:22.330821 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:22.330191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:22.330821 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:22.330371 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:22.330821 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:22.330392 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:22.330821 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:22.330404 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jw8rh for pod openshift-network-diagnostics/network-check-target-858rm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:22.330821 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:22.330474 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh podName:c887aba7-380b-4a80-bc2c-6e89b986da6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:24.330453975 +0000 UTC m=+6.191407696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jw8rh" (UniqueName: "kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh") pod "network-check-target-858rm" (UID: "c887aba7-380b-4a80-bc2c-6e89b986da6b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:22.733495 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:22.733312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:22.733495 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:22.733444 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:22.787063 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:22.787023 2576 generic.go:358] "Generic (PLEG): container finished" podID="6f55add77c4d902c98f085f4badc816c" containerID="045ae2395ea05de7c7bfff68dec3df826351d0eb62c970c4000b661da9d2be87" exitCode=0
Apr 22 19:23:22.787829 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:22.787794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" event={"ID":"6f55add77c4d902c98f085f4badc816c","Type":"ContainerDied","Data":"045ae2395ea05de7c7bfff68dec3df826351d0eb62c970c4000b661da9d2be87"}
Apr 22 19:23:23.733108 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:23.733071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:23.733300 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:23.733219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:23.802409 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:23.802357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" event={"ID":"6f55add77c4d902c98f085f4badc816c","Type":"ContainerStarted","Data":"1a14c7b8066ae99b54590842238ba14d2fca48d3ca925b079a43b347ece57b3b"}
Apr 22 19:23:24.244716 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:24.244585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:24.244922 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:24.244791 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:24.244922 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:24.244866 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs podName:ac86a56e-148b-4fb4-8415-acff438d7915 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:28.244847114 +0000 UTC m=+10.105800834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs") pod "network-metrics-daemon-4sl88" (UID: "ac86a56e-148b-4fb4-8415-acff438d7915") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:24.345454 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:24.345318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:24.345641 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:24.345525 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:24.345641 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:24.345543 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:24.345641 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:24.345557 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jw8rh for pod openshift-network-diagnostics/network-check-target-858rm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:24.345641 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:24.345619 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh podName:c887aba7-380b-4a80-bc2c-6e89b986da6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:28.345600321 +0000 UTC m=+10.206554051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jw8rh" (UniqueName: "kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh") pod "network-check-target-858rm" (UID: "c887aba7-380b-4a80-bc2c-6e89b986da6b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:24.735925 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:24.735702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:24.735925 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:24.735841 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:25.733228 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:25.733186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:25.733690 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:25.733330 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:26.735043 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:26.734546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:26.735043 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:26.734665 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:27.733886 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:27.733856 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:27.734056 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:27.733996 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:28.279580 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.279002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:28.279580 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.279154 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:28.279580 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.279218 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs podName:ac86a56e-148b-4fb4-8415-acff438d7915 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.279199325 +0000 UTC m=+18.140153046 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs") pod "network-metrics-daemon-4sl88" (UID: "ac86a56e-148b-4fb4-8415-acff438d7915") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:28.379962 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.379357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:28.379962 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.379524 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:28.379962 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.379542 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:28.379962 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.379555 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jw8rh for pod openshift-network-diagnostics/network-check-target-858rm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:28.379962 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.379613 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh podName:c887aba7-380b-4a80-bc2c-6e89b986da6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.379594307 +0000 UTC m=+18.240548026 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jw8rh" (UniqueName: "kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh") pod "network-check-target-858rm" (UID: "c887aba7-380b-4a80-bc2c-6e89b986da6b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:28.662181 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.660707 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-175.ec2.internal" podStartSLOduration=8.660688169 podStartE2EDuration="8.660688169s" podCreationTimestamp="2026-04-22 19:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:23.815448438 +0000 UTC m=+5.676402185" watchObservedRunningTime="2026-04-22 19:23:28.660688169 +0000 UTC m=+10.521641911"
Apr 22 19:23:28.662181 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.661390 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rb7nf"]
Apr 22 19:23:28.664998 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.664965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.665149 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.665046 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8"
Apr 22 19:23:28.734755 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.734279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:28.734755 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.734386 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:28.783584 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.783522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-kubelet-config\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.783584 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.783572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-dbus\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.783951 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.783684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.884904 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.884815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-kubelet-config\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.885071 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.884942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-dbus\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.885071 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.884953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-kubelet-config\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.885071 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.885030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.885475 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.885447 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:28.885595 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:28.885522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-dbus\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:28.885595 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:28.885529 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret podName:b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:29.385500441 +0000 UTC m=+11.246454159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret") pod "global-pull-secret-syncer-rb7nf" (UID: "b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:29.389225 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:29.389185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:29.389671 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:29.389340 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:29.389671 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:29.389401 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret podName:b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:30.389382121 +0000 UTC m=+12.250335839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret") pod "global-pull-secret-syncer-rb7nf" (UID: "b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:29.733425 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:29.733335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:29.733575 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:29.733485 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:30.397092 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:30.397045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:30.397568 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:30.397236 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:30.397568 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:30.397327 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret podName:b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:32.397305863 +0000 UTC m=+14.258259581 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret") pod "global-pull-secret-syncer-rb7nf" (UID: "b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:30.732949 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:30.732911 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:30.732949 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:30.732933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:30.733161 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:30.733056 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8"
Apr 22 19:23:30.733161 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:30.733132 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:31.733004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:31.732970 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:31.733443 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:31.733089 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:32.411976 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:32.411939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:32.412173 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:32.412102 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:32.412296 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:32.412191 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret podName:b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:36.412171643 +0000 UTC m=+18.273125373 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret") pod "global-pull-secret-syncer-rb7nf" (UID: "b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:23:32.733664 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:32.733578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:32.733664 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:32.733611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:32.734143 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:32.733714 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:32.734143 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:32.733864 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8"
Apr 22 19:23:33.732878 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:33.732857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:33.732970 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:33.732952 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915"
Apr 22 19:23:34.733938 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:34.733900 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:34.734415 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:34.733908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:34.734415 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:34.734041 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8"
Apr 22 19:23:34.734415 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:34.734083 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b"
Apr 22 19:23:35.733559 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:35.733513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:35.733825 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:35.733678 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:36.342696 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:36.342661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:36.343138 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.342836 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:36.343138 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.342915 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs podName:ac86a56e-148b-4fb4-8415-acff438d7915 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.34289358 +0000 UTC m=+34.203847304 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs") pod "network-metrics-daemon-4sl88" (UID: "ac86a56e-148b-4fb4-8415-acff438d7915") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:36.443821 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:36.443777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:36.444004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:36.443848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:36.444004 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.443950 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:36.444004 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.443980 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:36.444004 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.443995 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jw8rh for pod openshift-network-diagnostics/network-check-target-858rm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:36.444219 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.443956 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:36.444219 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.444054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh podName:c887aba7-380b-4a80-bc2c-6e89b986da6b nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.444035285 +0000 UTC m=+34.304989005 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jw8rh" (UniqueName: "kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh") pod "network-check-target-858rm" (UID: "c887aba7-380b-4a80-bc2c-6e89b986da6b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:36.444219 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.444097 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret podName:b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:44.444079012 +0000 UTC m=+26.305032728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret") pod "global-pull-secret-syncer-rb7nf" (UID: "b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:36.733071 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:36.732980 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:36.733236 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.733120 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:36.733236 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:36.733175 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:36.733349 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:36.733288 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:37.733644 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:37.733604 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:37.734112 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:37.733765 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:38.735155 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.734719 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:38.735931 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.734868 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:38.735931 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:38.735256 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:38.735931 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:38.735348 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:38.830343 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.830309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5hk26" event={"ID":"eb4cf245-3080-4779-80ac-295e37a8327f","Type":"ContainerStarted","Data":"6c870ee5e7c809432bbc8c69189d33b0bf01f96704fe872ac63c16bf963b663f"} Apr 22 19:23:38.836861 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.836491 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"bde6fbdec0a48bc15d28abf5df3c4b88f6de2c35381dade388a0cbf02f75d20d"} Apr 22 19:23:38.836861 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.836529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"df220c2d6e42d40448bf10eeafff8ac658eff7786a3ab1bf0315f2f74934c2df"} Apr 22 19:23:38.836861 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.836544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"f3fba7c8b1198e540a68bf3d35d285a000a035364d177befa990ad79bdf45662"} Apr 22 19:23:38.837818 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.837796 2576 generic.go:358] "Generic (PLEG): container finished" podID="17ada943-fc5e-4b09-bf82-9132909cb32d" containerID="fb2b856569a83672dc7aa70db2a75f3439d1c127c0f5c39256d8023beedcec4b" exitCode=0 Apr 22 19:23:38.837914 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.837862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" 
event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerDied","Data":"fb2b856569a83672dc7aa70db2a75f3439d1c127c0f5c39256d8023beedcec4b"} Apr 22 19:23:38.839431 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.839231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnb2n" event={"ID":"a2085208-6038-4f2b-929e-b75fe56d5a28","Type":"ContainerStarted","Data":"8006d4f80f532265ec3dca45fa6c7712fb0b6f8fe0ba2de223cd9ff76f76db0f"} Apr 22 19:23:38.840631 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.840610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" event={"ID":"f634baad-b7a2-4fe7-9706-4ecc6e5b9366","Type":"ContainerStarted","Data":"53b73fc925f3503deaa53922aab95236cb8b643312c4daeccbaff3b872e0a459"} Apr 22 19:23:38.842023 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.841994 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w4jz7" event={"ID":"2fb0effb-90c6-4bf4-8981-baf430cec62a","Type":"ContainerStarted","Data":"b3c6cf106e30a84a5c28601e680eb1581d5a2a959be3410106e1c4c86f36af65"} Apr 22 19:23:38.843364 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.843300 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5hk26" podStartSLOduration=4.239552383 podStartE2EDuration="20.843266273s" podCreationTimestamp="2026-04-22 19:23:18 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.410613475 +0000 UTC m=+3.271567204" lastFinishedPulling="2026-04-22 19:23:38.014327361 +0000 UTC m=+19.875281094" observedRunningTime="2026-04-22 19:23:38.84323649 +0000 UTC m=+20.704190228" watchObservedRunningTime="2026-04-22 19:23:38.843266273 +0000 UTC m=+20.704220012" Apr 22 19:23:38.843480 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.843360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nflqk" 
event={"ID":"4b532865-112c-43d5-a22b-58a5cece9682","Type":"ContainerStarted","Data":"d0959ae259268b54268fd57b093e549a51ff075f00ce17535a8754719698245e"} Apr 22 19:23:38.844662 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.844643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" event={"ID":"cf7e7909-ac67-46c2-af45-9ceb49eb60c2","Type":"ContainerStarted","Data":"0d67c6a8f2031536abe72f3ef9cb25023ea4d0c49d15159b91879f4e8580af83"} Apr 22 19:23:38.854825 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.854779 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pnb2n" podStartSLOduration=8.548485728 podStartE2EDuration="20.854764442s" podCreationTimestamp="2026-04-22 19:23:18 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.403091453 +0000 UTC m=+3.264045168" lastFinishedPulling="2026-04-22 19:23:33.709370149 +0000 UTC m=+15.570323882" observedRunningTime="2026-04-22 19:23:38.85467188 +0000 UTC m=+20.715625617" watchObservedRunningTime="2026-04-22 19:23:38.854764442 +0000 UTC m=+20.715718179" Apr 22 19:23:38.879821 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.879784 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w4jz7" podStartSLOduration=7.578528518 podStartE2EDuration="19.879769898s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.408205384 +0000 UTC m=+3.269159100" lastFinishedPulling="2026-04-22 19:23:33.70944676 +0000 UTC m=+15.570400480" observedRunningTime="2026-04-22 19:23:38.8795782 +0000 UTC m=+20.740531936" watchObservedRunningTime="2026-04-22 19:23:38.879769898 +0000 UTC m=+20.740723635" Apr 22 19:23:38.893430 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.893393 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tmk7w" 
podStartSLOduration=2.922770288 podStartE2EDuration="19.893380968s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.415805536 +0000 UTC m=+3.276759254" lastFinishedPulling="2026-04-22 19:23:38.386416219 +0000 UTC m=+20.247369934" observedRunningTime="2026-04-22 19:23:38.892719222 +0000 UTC m=+20.753672970" watchObservedRunningTime="2026-04-22 19:23:38.893380968 +0000 UTC m=+20.754334705" Apr 22 19:23:38.909934 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:38.909889 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nflqk" podStartSLOduration=3.896593971 podStartE2EDuration="20.909869808s" podCreationTimestamp="2026-04-22 19:23:18 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.407140721 +0000 UTC m=+3.268094450" lastFinishedPulling="2026-04-22 19:23:38.420416556 +0000 UTC m=+20.281370287" observedRunningTime="2026-04-22 19:23:38.908918774 +0000 UTC m=+20.769872507" watchObservedRunningTime="2026-04-22 19:23:38.909869808 +0000 UTC m=+20.770823547" Apr 22 19:23:39.328935 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.328844 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:39.329711 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.329685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:39.732924 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.732890 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:39.733074 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:39.733014 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:39.817645 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.817613 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:23:39.850251 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.850210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"c082c249fbfeecc31e565acd585710f36d7f9db463c119ff32691a0055b2b517"} Apr 22 19:23:39.850251 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.850254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"247fe63885760e3540e95659dea597cb3eac312038d371d55a6f369d29ffc056"} Apr 22 19:23:39.850450 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.850270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"7cc34160eaa1c9f75d8e7e7766c96950e90a55c0a1930391a0b66347defd4881"} Apr 22 19:23:39.851985 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.851950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" event={"ID":"f634baad-b7a2-4fe7-9706-4ecc6e5b9366","Type":"ContainerStarted","Data":"fc801b5668331a72d3c95e6e134e6f7d3fdd578d36ee42bc8ad5a56c128ab3b5"} Apr 22 19:23:39.852212 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.852193 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:39.852908 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:39.852889 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w4jz7" Apr 22 19:23:40.678742 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.678074 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:39.817629732Z","UUID":"70f9774c-b489-40bb-ad06-f3ff84f2cbc7","Handler":null,"Name":"","Endpoint":""} Apr 22 19:23:40.681184 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.681162 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:23:40.681328 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.681195 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:23:40.732860 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.732835 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:40.733005 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:40.732961 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:40.733005 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.732971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:40.733117 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:40.733048 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:40.856351 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.856319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" event={"ID":"f634baad-b7a2-4fe7-9706-4ecc6e5b9366","Type":"ContainerStarted","Data":"c66d3817e814832b6ecccdc9aa5d50e25e5c93e028a23b218585333dc8bf3318"} Apr 22 19:23:40.857791 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.857762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-68jrq" event={"ID":"6c9d2810-4d20-4eb6-9318-15765f879dfa","Type":"ContainerStarted","Data":"2361b99fec11aa3c23bf577f7f5855f66102516c908e658ba4119f0aa4039d4e"} Apr 22 19:23:40.883162 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.883116 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2l7p" podStartSLOduration=2.65023601 podStartE2EDuration="21.883098861s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.414572379 +0000 UTC m=+3.275526094" lastFinishedPulling="2026-04-22 19:23:40.647435225 +0000 UTC m=+22.508388945" observedRunningTime="2026-04-22 19:23:40.88238571 +0000 UTC m=+22.743339448" watchObservedRunningTime="2026-04-22 19:23:40.883098861 +0000 UTC m=+22.744052596" Apr 22 19:23:40.898206 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:40.898113 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-68jrq" podStartSLOduration=4.923570183 podStartE2EDuration="21.898096757s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.411865304 +0000 UTC m=+3.272819031" lastFinishedPulling="2026-04-22 19:23:38.386391876 +0000 UTC m=+20.247345605" observedRunningTime="2026-04-22 
19:23:40.897937761 +0000 UTC m=+22.758891497" watchObservedRunningTime="2026-04-22 19:23:40.898096757 +0000 UTC m=+22.759050496" Apr 22 19:23:41.733770 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:41.733719 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:41.733950 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:41.733865 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:41.862344 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:41.862304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"7f2dd84bbd636881dec15eb796e54f796d46e1df30c213ed3d57209fab37d313"} Apr 22 19:23:42.733133 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:42.733095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:42.733293 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:42.733100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:42.733293 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:42.733195 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:42.733403 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:42.733300 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:43.733058 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.732840 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:43.733661 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:43.733089 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:43.867483 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.867452 2576 generic.go:358] "Generic (PLEG): container finished" podID="17ada943-fc5e-4b09-bf82-9132909cb32d" containerID="6bdbd44a21714665b9cc183eb28be1992705251a00db503e12cff5e0cbe6e925" exitCode=0 Apr 22 19:23:43.867647 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.867538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerDied","Data":"6bdbd44a21714665b9cc183eb28be1992705251a00db503e12cff5e0cbe6e925"} Apr 22 19:23:43.870558 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.870531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" event={"ID":"7ec73ca3-22df-4ef0-ad03-92031016c8b8","Type":"ContainerStarted","Data":"4930d96cf1e9cbbdab9aeaaf2af9abaad0e9d7bb78fad21cf2ea2a7cd0b822b9"} Apr 22 19:23:43.870969 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.870950 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:43.870969 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.870980 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:43.885124 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.885102 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:43.885243 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.885180 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:43.927169 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:43.927120 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" podStartSLOduration=7.760391066 podStartE2EDuration="24.927104861s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="2026-04-22 19:23:21.410778122 +0000 UTC m=+3.271731851" lastFinishedPulling="2026-04-22 19:23:38.577491932 +0000 UTC m=+20.438445646" observedRunningTime="2026-04-22 19:23:43.926685145 +0000 UTC m=+25.787638883" watchObservedRunningTime="2026-04-22 19:23:43.927104861 +0000 UTC m=+25.788058597" Apr 22 19:23:44.505575 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:44.505023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:44.505575 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:44.505149 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:44.505575 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:44.505208 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret podName:b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:00.505190736 +0000 UTC m=+42.366144468 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret") pod "global-pull-secret-syncer-rb7nf" (UID: "b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:44.733596 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:44.733429 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:44.733938 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:44.733459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:44.733938 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:44.733691 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:44.733938 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:44.733804 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:44.874116 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:44.874080 2576 generic.go:358] "Generic (PLEG): container finished" podID="17ada943-fc5e-4b09-bf82-9132909cb32d" containerID="7761617ae866fe9e5df9b3a0090d8b6061882ad5ac15eeb84a8193d73cf8ad11" exitCode=0 Apr 22 19:23:44.874265 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:44.874123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerDied","Data":"7761617ae866fe9e5df9b3a0090d8b6061882ad5ac15eeb84a8193d73cf8ad11"} Apr 22 19:23:44.874357 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:44.874343 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:23:45.179235 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.179157 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rb7nf"] Apr 22 19:23:45.179383 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.179297 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:45.179454 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:45.179402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:45.182415 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.182392 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-858rm"] Apr 22 19:23:45.182502 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.182483 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:45.182573 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:45.182558 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:45.185402 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.185372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4sl88"] Apr 22 19:23:45.185501 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.185488 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:45.185617 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:45.185597 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:45.445417 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.445339 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:23:45.877773 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.877665 2576 generic.go:358] "Generic (PLEG): container finished" podID="17ada943-fc5e-4b09-bf82-9132909cb32d" containerID="3d6e06db85b76d9fecf2426c55b7186a5b75e522085b0907ad0315870865a14f" exitCode=0 Apr 22 19:23:45.877773 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:45.877760 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerDied","Data":"3d6e06db85b76d9fecf2426c55b7186a5b75e522085b0907ad0315870865a14f"} Apr 22 19:23:46.733609 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:46.733574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:46.733793 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:46.733696 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:46.733793 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:46.733574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:46.733793 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:46.733764 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:46.733960 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:46.733850 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:46.734013 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:46.733956 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:48.734158 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:48.734117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:48.734624 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:48.734228 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:48.734624 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:48.734316 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:48.734624 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:48.734428 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:48.734624 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:48.734471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:48.734624 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:48.734608 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:50.733687 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:50.733653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:50.734163 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:50.733709 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:23:50.734163 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:50.733812 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-858rm" podUID="c887aba7-380b-4a80-bc2c-6e89b986da6b" Apr 22 19:23:50.734163 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:50.733898 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:23:50.734163 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:50.733886 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb7nf" podUID="b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8" Apr 22 19:23:50.734163 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:50.734045 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4sl88" podUID="ac86a56e-148b-4fb4-8415-acff438d7915" Apr 22 19:23:51.510933 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.510904 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-175.ec2.internal" event="NodeReady" Apr 22 19:23:51.511097 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.511011 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:23:51.544838 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.544802 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7674868db4-j8tkf"] Apr 22 19:23:51.565365 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.564961 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5rw5p"] Apr 22 19:23:51.565365 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.565063 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.568376 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.568352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:23:51.568512 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.568494 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:23:51.568512 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.568503 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:23:51.568843 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.568826 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-df49v\"" Apr 22 19:23:51.574174 
ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.574152 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:23:51.581057 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.580713 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v"] Apr 22 19:23:51.594688 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.594663 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"] Apr 22 19:23:51.594804 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.594773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:23:51.594997 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.594909 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v" Apr 22 19:23:51.596837 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.596820 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 19:23:51.596970 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.596952 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:51.597026 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.597004 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:51.597079 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.597046 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bts6t\"" Apr 22 19:23:51.597140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.597125 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-jllz2\"" Apr 22 19:23:51.597211 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.597181 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 19:23:51.597337 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.597319 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:51.597444 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.597349 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:51.606142 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.606120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 19:23:51.619560 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.619541 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-748dj"] Apr 22 19:23:51.619714 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.619698 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:23:51.623542 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.623523 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:23:51.623629 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.623549 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:51.624086 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.624051 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hv58p\"" Apr 22 19:23:51.624086 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.624052 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:51.637570 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.637523 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"] Apr 22 19:23:51.637768 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.637747 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.639496 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.639479 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:23:51.639885 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.639867 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-6bl7r\"" Apr 22 19:23:51.639984 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.639936 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:23:51.640055 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.639986 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:23:51.640055 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.640017 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:23:51.645524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.645495 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:23:51.655403 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.655383 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7697566b5d-hh5dn"] Apr 22 19:23:51.655548 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.655531 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" Apr 22 19:23:51.657402 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.657385 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-t5r89\"" Apr 22 19:23:51.657509 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.657388 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 19:23:51.657704 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.657688 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:51.657803 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.657715 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:51.657860 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.657847 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 19:23:51.662073 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnpm\" (UniqueName: \"kubernetes.io/projected/45a00f57-cea9-483d-bda8-358ac125ff3b-kube-api-access-9mnpm\") pod \"volume-data-source-validator-7c6cbb6c87-pq94v\" (UID: \"45a00f57-cea9-483d-bda8-358ac125ff3b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v" Apr 22 19:23:51.662145 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-image-registry-private-configuration\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662145 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af8ba87e-3b79-4273-af80-a16c607dbf85-serving-cert\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:23:51.662208 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-installation-pull-secrets\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662208 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrl5c\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-kube-api-access-qrl5c\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662287 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af8ba87e-3b79-4273-af80-a16c607dbf85-config\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:23:51.662325 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14676505-81f2-4619-b4da-c41d68fefcce-ca-trust-extracted\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662357 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662390 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-trusted-ca\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662390 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wlxh\" (UniqueName: \"kubernetes.io/projected/af8ba87e-3b79-4273-af80-a16c607dbf85-kube-api-access-7wlxh\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:23:51.662451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662402 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-registry-certificates\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-bound-sa-token\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:51.662451 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.662435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af8ba87e-3b79-4273-af80-a16c607dbf85-trusted-ca\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:23:51.673597 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.673577 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"] Apr 22 19:23:51.673761 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.673745 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.675631 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.675612 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 19:23:51.675754 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.675648 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-4sm6v\""
Apr 22 19:23:51.675754 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.675694 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 19:23:51.675754 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.675747 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 19:23:51.675895 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.675758 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 19:23:51.675967 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.675949 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 19:23:51.676023 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.675960 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 19:23:51.691881 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.691862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5rw5p"]
Apr 22 19:23:51.691881 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.691885 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v"]
Apr 22 19:23:51.692004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.691895 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7674868db4-j8tkf"]
Apr 22 19:23:51.692004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.691902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-748dj"]
Apr 22 19:23:51.692004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.691910 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"]
Apr 22 19:23:51.692004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.691920 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"]
Apr 22 19:23:51.692004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.691989 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"
Apr 22 19:23:51.693697 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.693679 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 22 19:23:51.693887 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.693870 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:23:51.693987 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.693970 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 22 19:23:51.693987 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.693981 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8q5sb\""
Apr 22 19:23:51.694069 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.694039 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 22 19:23:51.709869 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.709848 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"]
Apr 22 19:23:51.709869 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.709868 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wg445"]
Apr 22 19:23:51.709991 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.709971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"
Apr 22 19:23:51.711767 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.711751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 19:23:51.711968 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.711951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 19:23:51.713252 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.713202 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 19:23:51.714090 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.713531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-kb7vh\""
Apr 22 19:23:51.716528 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.716005 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 19:23:51.724690 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.724670 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q"]
Apr 22 19:23:51.724877 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.724857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445"
Apr 22 19:23:51.726708 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.726691 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 22 19:23:51.727152 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.727134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 22 19:23:51.727249 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.727161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-f2r5m\""
Apr 22 19:23:51.742290 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.742255 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rbbtr"]
Apr 22 19:23:51.742848 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.742414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q"
Apr 22 19:23:51.744344 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.744326 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mjqf7\""
Apr 22 19:23:51.744629 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.744598 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:23:51.744842 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.744634 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:23:51.756203 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7697566b5d-hh5dn"]
Apr 22 19:23:51.756280 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756207 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"]
Apr 22 19:23:51.756280 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756218 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"]
Apr 22 19:23:51.756280 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756231 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wg445"]
Apr 22 19:23:51.756280 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756246 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbbtr"]
Apr 22 19:23:51.756280 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756257 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q"]
Apr 22 19:23:51.756428 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756290 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cj979"]
Apr 22 19:23:51.756428 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.756319 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbbtr"
Apr 22 19:23:51.758423 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.758407 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 19:23:51.759313 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.759298 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 19:23:51.759500 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.759486 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 19:23:51.759575 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.759509 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wknft\""
Apr 22 19:23:51.763291 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/944e0b61-e3fd-4e73-88f5-25f65ad44d11-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"
Apr 22 19:23:51.763402 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97ce33ae-619d-451a-8e53-dc35cdbba9e3-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.763402 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97ce33ae-619d-451a-8e53-dc35cdbba9e3-service-ca-bundle\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.763402 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403f71c1-3bb4-4aef-9a26-b062489c9d03-config\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: \"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"
Apr 22 19:23:51.763402 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af8ba87e-3b79-4273-af80-a16c607dbf85-trusted-ca\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-trusted-ca\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944e0b61-e3fd-4e73-88f5-25f65ad44d11-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ns8x\" (UniqueName: \"kubernetes.io/projected/117697b4-88b4-4152-942c-224dcf13a685-kube-api-access-5ns8x\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/97ce33ae-619d-451a-8e53-dc35cdbba9e3-tmp\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjdnf\" (UniqueName: \"kubernetes.io/projected/3543ced1-0882-444d-be8a-53d6ef47c1e6-kube-api-access-jjdnf\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-stats-auth\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-image-registry-private-configuration\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af8ba87e-3b79-4273-af80-a16c607dbf85-serving-cert\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.763626 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-installation-pull-secrets\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-default-certificate\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrl5c\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-kube-api-access-qrl5c\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af8ba87e-3b79-4273-af80-a16c607dbf85-config\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14676505-81f2-4619-b4da-c41d68fefcce-ca-trust-extracted\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ce33ae-619d-451a-8e53-dc35cdbba9e3-serving-cert\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshzs\" (UniqueName: \"kubernetes.io/projected/97ce33ae-619d-451a-8e53-dc35cdbba9e3-kube-api-access-wshzs\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqtx2\" (UniqueName: \"kubernetes.io/projected/403f71c1-3bb4-4aef-9a26-b062489c9d03-kube-api-access-tqtx2\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: \"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.763988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403f71c1-3bb4-4aef-9a26-b062489c9d03-serving-cert\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: \"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wlxh\" (UniqueName: \"kubernetes.io/projected/af8ba87e-3b79-4273-af80-a16c607dbf85-kube-api-access-7wlxh\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.764081 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/97ce33ae-619d-451a-8e53-dc35cdbba9e3-snapshots\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-registry-certificates\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-bound-sa-token\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af8ba87e-3b79-4273-af80-a16c607dbf85-trusted-ca\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnpm\" (UniqueName: \"kubernetes.io/projected/45a00f57-cea9-483d-bda8-358ac125ff3b-kube-api-access-9mnpm\") pod \"volume-data-source-validator-7c6cbb6c87-pq94v\" (UID: \"45a00f57-cea9-483d-bda8-358ac125ff3b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgpsh\" (UniqueName: \"kubernetes.io/projected/944e0b61-e3fd-4e73-88f5-25f65ad44d11-kube-api-access-fgpsh\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-trusted-ca\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.764488 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.764504 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7674868db4-j8tkf: secret "image-registry-tls" not found
Apr 22 19:23:51.764686 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.764559 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls podName:14676505-81f2-4619-b4da-c41d68fefcce nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.264543038 +0000 UTC m=+34.125496767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls") pod "image-registry-7674868db4-j8tkf" (UID: "14676505-81f2-4619-b4da-c41d68fefcce") : secret "image-registry-tls" not found
Apr 22 19:23:51.765202 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.764857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14676505-81f2-4619-b4da-c41d68fefcce-ca-trust-extracted\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.765258 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.765215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-registry-certificates\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.765757 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.765625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af8ba87e-3b79-4273-af80-a16c607dbf85-config\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.767628 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.767604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af8ba87e-3b79-4273-af80-a16c607dbf85-serving-cert\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.767760 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.767610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-installation-pull-secrets\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.767760 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.767640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-image-registry-private-configuration\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.778998 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.778975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-bound-sa-token\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.779096 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.779030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnpm\" (UniqueName: \"kubernetes.io/projected/45a00f57-cea9-483d-bda8-358ac125ff3b-kube-api-access-9mnpm\") pod \"volume-data-source-validator-7c6cbb6c87-pq94v\" (UID: \"45a00f57-cea9-483d-bda8-358ac125ff3b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v"
Apr 22 19:23:51.780044 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.780025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wlxh\" (UniqueName: \"kubernetes.io/projected/af8ba87e-3b79-4273-af80-a16c607dbf85-kube-api-access-7wlxh\") pod \"console-operator-9d4b6777b-5rw5p\" (UID: \"af8ba87e-3b79-4273-af80-a16c607dbf85\") " pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p"
Apr 22 19:23:51.781068 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.781049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrl5c\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-kube-api-access-qrl5c\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:51.785572 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.785556 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cj979"]
Apr 22 19:23:51.785642 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.785616 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cj979"
Apr 22 19:23:51.788049 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.788034 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 19:23:51.788362 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.788343 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 19:23:51.788508 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.788493 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bspvd\""
Apr 22 19:23:51.865035 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-stats-auth\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.865139 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-default-certificate\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.865139 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.865139 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3dd0e38f-633c-4e17-8389-15a41d942093-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"
Apr 22 19:23:51.865248 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.865160 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:23:51.865248 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ce33ae-619d-451a-8e53-dc35cdbba9e3-serving-cert\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.865248 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wshzs\" (UniqueName: \"kubernetes.io/projected/97ce33ae-619d-451a-8e53-dc35cdbba9e3-kube-api-access-wshzs\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj"
Apr 22 19:23:51.865248 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqtx2\" (UniqueName: \"kubernetes.io/projected/403f71c1-3bb4-4aef-9a26-b062489c9d03-kube-api-access-tqtx2\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: \"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"
Apr 22 19:23:51.865248 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.865241 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.365220532 +0000 UTC m=+34.226174248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : secret "router-metrics-certs-default" not found
Apr 22 19:23:51.865468 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:51.865468 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a439b73c-32fc-4331-99ac-1f81f67a857d-tmp-dir\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979"
Apr 22 19:23:51.865468 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403f71c1-3bb4-4aef-9a26-b062489c9d03-serving-cert\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: \"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"
Apr 22 19:23:51.865468 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3a182f49-9fd9-43cf-983a-9e45aae094ac-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445"
Apr 22 19:23:51.865468 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:23:51.865468 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.865426 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.365407629 +0000 UTC m=+34.226361346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : configmap references non-existent config key: service-ca.crt Apr 22 19:23:51.865468 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdsq4\" (UniqueName: \"kubernetes.io/projected/3dd0e38f-633c-4e17-8389-15a41d942093-kube-api-access-sdsq4\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/97ce33ae-619d-451a-8e53-dc35cdbba9e3-snapshots\") pod 
\"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxmcc\" (UniqueName: \"kubernetes.io/projected/69e02842-f395-40eb-875a-3561c42e7eef-kube-api-access-fxmcc\") pod \"network-check-source-8894fc9bd-v2f7q\" (UID: \"69e02842-f395-40eb-875a-3561c42e7eef\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q" Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 
19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.865663 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgpsh\" (UniqueName: \"kubernetes.io/projected/944e0b61-e3fd-4e73-88f5-25f65ad44d11-kube-api-access-fgpsh\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a439b73c-32fc-4331-99ac-1f81f67a857d-config-volume\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.865782 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.865755 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls podName:3543ced1-0882-444d-be8a-53d6ef47c1e6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.365740543 +0000 UTC m=+34.226694258 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrxdf" (UID: "3543ced1-0882-444d-be8a-53d6ef47c1e6") : secret "samples-operator-tls" not found Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/944e0b61-e3fd-4e73-88f5-25f65ad44d11-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97ce33ae-619d-451a-8e53-dc35cdbba9e3-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97ce33ae-619d-451a-8e53-dc35cdbba9e3-service-ca-bundle\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403f71c1-3bb4-4aef-9a26-b062489c9d03-config\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: 
\"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxcfj\" (UniqueName: \"kubernetes.io/projected/a439b73c-32fc-4331-99ac-1f81f67a857d-kube-api-access-vxcfj\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.865989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944e0b61-e3fd-4e73-88f5-25f65ad44d11-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.866014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ns8x\" (UniqueName: \"kubernetes.io/projected/117697b4-88b4-4152-942c-224dcf13a685-kube-api-access-5ns8x\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.866032 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/97ce33ae-619d-451a-8e53-dc35cdbba9e3-tmp\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.866048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjdnf\" (UniqueName: \"kubernetes.io/projected/3543ced1-0882-444d-be8a-53d6ef47c1e6-kube-api-access-jjdnf\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:23:51.866140 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.866082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qznh9\" (UniqueName: \"kubernetes.io/projected/04ec7b20-5a3e-44b4-afea-44994d100476-kube-api-access-qznh9\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:23:51.866706 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.866599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/944e0b61-e3fd-4e73-88f5-25f65ad44d11-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" Apr 22 19:23:51.866781 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.866761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403f71c1-3bb4-4aef-9a26-b062489c9d03-config\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: 
\"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" Apr 22 19:23:51.867788 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.867763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403f71c1-3bb4-4aef-9a26-b062489c9d03-serving-cert\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: \"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" Apr 22 19:23:51.868009 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.867991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-stats-auth\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:51.868074 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.868047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-default-certificate\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:51.868109 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.868075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/944e0b61-e3fd-4e73-88f5-25f65ad44d11-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" Apr 22 19:23:51.871699 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.871671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97ce33ae-619d-451a-8e53-dc35cdbba9e3-service-ca-bundle\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.871826 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.871747 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97ce33ae-619d-451a-8e53-dc35cdbba9e3-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.873324 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.873108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97ce33ae-619d-451a-8e53-dc35cdbba9e3-serving-cert\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.873535 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.873494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqtx2\" (UniqueName: \"kubernetes.io/projected/403f71c1-3bb4-4aef-9a26-b062489c9d03-kube-api-access-tqtx2\") pod \"service-ca-operator-d6fc45fc5-z7d4f\" (UID: \"403f71c1-3bb4-4aef-9a26-b062489c9d03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" Apr 22 19:23:51.874067 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.874048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshzs\" (UniqueName: \"kubernetes.io/projected/97ce33ae-619d-451a-8e53-dc35cdbba9e3-kube-api-access-wshzs\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " 
pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.874147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.874121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgpsh\" (UniqueName: \"kubernetes.io/projected/944e0b61-e3fd-4e73-88f5-25f65ad44d11-kube-api-access-fgpsh\") pod \"kube-storage-version-migrator-operator-6769c5d45-mqx2f\" (UID: \"944e0b61-e3fd-4e73-88f5-25f65ad44d11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" Apr 22 19:23:51.874422 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.874406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjdnf\" (UniqueName: \"kubernetes.io/projected/3543ced1-0882-444d-be8a-53d6ef47c1e6-kube-api-access-jjdnf\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:23:51.875067 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.875050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ns8x\" (UniqueName: \"kubernetes.io/projected/117697b4-88b4-4152-942c-224dcf13a685-kube-api-access-5ns8x\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:51.876498 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.876477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/97ce33ae-619d-451a-8e53-dc35cdbba9e3-tmp\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.876574 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.876531 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/97ce33ae-619d-451a-8e53-dc35cdbba9e3-snapshots\") pod \"insights-operator-585dfdc468-748dj\" (UID: \"97ce33ae-619d-451a-8e53-dc35cdbba9e3\") " pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.906264 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.906237 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:23:51.910792 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.910772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v" Apr 22 19:23:51.946445 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.946415 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-748dj" Apr 22 19:23:51.964425 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.964393 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" Apr 22 19:23:51.967440 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxmcc\" (UniqueName: \"kubernetes.io/projected/69e02842-f395-40eb-875a-3561c42e7eef-kube-api-access-fxmcc\") pod \"network-check-source-8894fc9bd-v2f7q\" (UID: \"69e02842-f395-40eb-875a-3561c42e7eef\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q" Apr 22 19:23:51.967587 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.967587 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a439b73c-32fc-4331-99ac-1f81f67a857d-config-volume\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.967587 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxcfj\" (UniqueName: \"kubernetes.io/projected/a439b73c-32fc-4331-99ac-1f81f67a857d-kube-api-access-vxcfj\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.967587 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod 
\"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:23:51.967587 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qznh9\" (UniqueName: \"kubernetes.io/projected/04ec7b20-5a3e-44b4-afea-44994d100476-kube-api-access-qznh9\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:23:51.967587 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3dd0e38f-633c-4e17-8389-15a41d942093-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:23:51.968048 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a439b73c-32fc-4331-99ac-1f81f67a857d-tmp-dir\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.968048 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3a182f49-9fd9-43cf-983a-9e45aae094ac-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:23:51.968048 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:23:51.968048 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdsq4\" (UniqueName: \"kubernetes.io/projected/3dd0e38f-633c-4e17-8389-15a41d942093-kube-api-access-sdsq4\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:23:51.968048 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.967750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:23:51.968048 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.967859 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:23:51.968048 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.967932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert podName:3a182f49-9fd9-43cf-983a-9e45aae094ac nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.46790393 +0000 UTC m=+34.328857645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wg445" (UID: "3a182f49-9fd9-43cf-983a-9e45aae094ac") : secret "networking-console-plugin-cert" not found Apr 22 19:23:51.968475 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.968133 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:23:51.968475 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.968193 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls podName:3dd0e38f-633c-4e17-8389-15a41d942093 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.468172852 +0000 UTC m=+34.329126568 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5sfmj" (UID: "3dd0e38f-633c-4e17-8389-15a41d942093") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:23:51.968475 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.968447 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:51.968639 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.968497 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert podName:04ec7b20-5a3e-44b4-afea-44994d100476 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.468482079 +0000 UTC m=+34.329435796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert") pod "ingress-canary-rbbtr" (UID: "04ec7b20-5a3e-44b4-afea-44994d100476") : secret "canary-serving-cert" not found Apr 22 19:23:51.968639 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.968539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a439b73c-32fc-4331-99ac-1f81f67a857d-tmp-dir\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.968639 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.968565 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:51.968856 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.968563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a439b73c-32fc-4331-99ac-1f81f67a857d-config-volume\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:51.969421 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:51.969366 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls podName:a439b73c-32fc-4331-99ac-1f81f67a857d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.469334332 +0000 UTC m=+34.330288126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls") pod "dns-default-cj979" (UID: "a439b73c-32fc-4331-99ac-1f81f67a857d") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:51.969421 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.969380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3a182f49-9fd9-43cf-983a-9e45aae094ac-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445"
Apr 22 19:23:51.970963 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.970939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3dd0e38f-633c-4e17-8389-15a41d942093-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"
Apr 22 19:23:51.982361 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.980025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxcfj\" (UniqueName: \"kubernetes.io/projected/a439b73c-32fc-4331-99ac-1f81f67a857d-kube-api-access-vxcfj\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979"
Apr 22 19:23:51.983524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.982696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qznh9\" (UniqueName: \"kubernetes.io/projected/04ec7b20-5a3e-44b4-afea-44994d100476-kube-api-access-qznh9\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr"
Apr 22 19:23:51.983524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.983329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxmcc\" (UniqueName: \"kubernetes.io/projected/69e02842-f395-40eb-875a-3561c42e7eef-kube-api-access-fxmcc\") pod \"network-check-source-8894fc9bd-v2f7q\" (UID: \"69e02842-f395-40eb-875a-3561c42e7eef\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q"
Apr 22 19:23:51.985198 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:51.985157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdsq4\" (UniqueName: \"kubernetes.io/projected/3dd0e38f-633c-4e17-8389-15a41d942093-kube-api-access-sdsq4\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"
Apr 22 19:23:52.000179 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.000148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"
Apr 22 19:23:52.058329 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.058157 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q"
Apr 22 19:23:52.071425 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.071371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5rw5p"]
Apr 22 19:23:52.073295 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.073270 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v"]
Apr 22 19:23:52.129112 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:52.129071 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a00f57_cea9_483d_bda8_358ac125ff3b.slice/crio-7993132bb7b60bfc5ea1468826dcf48143430745f74bb49db656b2422841cf77 WatchSource:0}: Error finding container 7993132bb7b60bfc5ea1468826dcf48143430745f74bb49db656b2422841cf77: Status 404 returned error can't find the container with id 7993132bb7b60bfc5ea1468826dcf48143430745f74bb49db656b2422841cf77
Apr 22 19:23:52.129613 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:52.129562 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf8ba87e_3b79_4273_af80_a16c607dbf85.slice/crio-68d13a81bdde542de352f9a1191d8840979ed82cc527a030ddf771cda3ce912f WatchSource:0}: Error finding container 68d13a81bdde542de352f9a1191d8840979ed82cc527a030ddf771cda3ce912f: Status 404 returned error can't find the container with id 68d13a81bdde542de352f9a1191d8840979ed82cc527a030ddf771cda3ce912f
Apr 22 19:23:52.249445 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.249413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-748dj"]
Apr 22 19:23:52.252212 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.252187 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q"]
Apr 22 19:23:52.258473 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.257375 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f"]
Apr 22 19:23:52.261105 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.261082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f"]
Apr 22 19:23:52.264243 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:52.264147 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ce33ae_619d_451a_8e53_dc35cdbba9e3.slice/crio-a4391ffcaccc394df7d639d141e688d234d2a2ceb88b189d79761a3a25ceb251 WatchSource:0}: Error finding container a4391ffcaccc394df7d639d141e688d234d2a2ceb88b189d79761a3a25ceb251: Status 404 returned error can't find the container with id a4391ffcaccc394df7d639d141e688d234d2a2ceb88b189d79761a3a25ceb251
Apr 22 19:23:52.265156 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:52.264744 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e02842_f395_40eb_875a_3561c42e7eef.slice/crio-49af9b2fbbd7b68a689ccea846503ec472ba8126ac72013dd5a41e176fc4b228 WatchSource:0}: Error finding container 49af9b2fbbd7b68a689ccea846503ec472ba8126ac72013dd5a41e176fc4b228: Status 404 returned error can't find the container with id 49af9b2fbbd7b68a689ccea846503ec472ba8126ac72013dd5a41e176fc4b228
Apr 22 19:23:52.266386 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:52.266109 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403f71c1_3bb4_4aef_9a26_b062489c9d03.slice/crio-a2fab8ca93ef89bb00e636edd290c2d3f13881593741583c18bfa6586347fc30 WatchSource:0}: Error finding container a2fab8ca93ef89bb00e636edd290c2d3f13881593741583c18bfa6586347fc30: Status 404 returned error can't find the container with id a2fab8ca93ef89bb00e636edd290c2d3f13881593741583c18bfa6586347fc30
Apr 22 19:23:52.270165 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.270141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:52.270416 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.270399 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:52.270534 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.270418 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7674868db4-j8tkf: secret "image-registry-tls" not found
Apr 22 19:23:52.270534 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.270462 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls podName:14676505-81f2-4619-b4da-c41d68fefcce nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.270445342 +0000 UTC m=+35.131399058 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls") pod "image-registry-7674868db4-j8tkf" (UID: "14676505-81f2-4619-b4da-c41d68fefcce") : secret "image-registry-tls" not found
Apr 22 19:23:52.370862 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.370828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"
Apr 22 19:23:52.371032 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.370979 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:23:52.371076 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.371031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:52.371076 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.371040 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls podName:3543ced1-0882-444d-be8a-53d6ef47c1e6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.371024871 +0000 UTC m=+35.231978586 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrxdf" (UID: "3543ced1-0882-444d-be8a-53d6ef47c1e6") : secret "samples-operator-tls" not found
Apr 22 19:23:52.371076 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.371072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:52.371179 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.371100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:52.371179 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.371158 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:23:52.371241 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.371183 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:52.371241 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.371224 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.371190051 +0000 UTC m=+35.232143770 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : secret "router-metrics-certs-default" not found
Apr 22 19:23:52.371301 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.371244 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.371236448 +0000 UTC m=+35.232190166 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : configmap references non-existent config key: service-ca.crt
Apr 22 19:23:52.371301 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.371254 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs podName:ac86a56e-148b-4fb4-8415-acff438d7915 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:24.371248891 +0000 UTC m=+66.232202609 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs") pod "network-metrics-daemon-4sl88" (UID: "ac86a56e-148b-4fb4-8415-acff438d7915") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:52.472127 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.472089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:52.472281 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.472184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"
Apr 22 19:23:52.472281 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.472215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445"
Apr 22 19:23:52.472281 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.472247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979"
Apr 22 19:23:52.472281 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472268 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.472293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr"
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472340 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls podName:3dd0e38f-633c-4e17-8389-15a41d942093 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.472321275 +0000 UTC m=+35.333275003 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5sfmj" (UID: "3dd0e38f-633c-4e17-8389-15a41d942093") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472348 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472376 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472421 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert podName:04ec7b20-5a3e-44b4-afea-44994d100476 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.472408739 +0000 UTC m=+35.333362456 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert") pod "ingress-canary-rbbtr" (UID: "04ec7b20-5a3e-44b4-afea-44994d100476") : secret "canary-serving-cert" not found
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472421 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472434 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert podName:3a182f49-9fd9-43cf-983a-9e45aae094ac nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.472428452 +0000 UTC m=+35.333382166 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wg445" (UID: "3a182f49-9fd9-43cf-983a-9e45aae094ac") : secret "networking-console-plugin-cert" not found
Apr 22 19:23:52.472467 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:52.472458 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls podName:a439b73c-32fc-4331-99ac-1f81f67a857d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:53.472448787 +0000 UTC m=+35.333402502 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls") pod "dns-default-cj979" (UID: "a439b73c-32fc-4331-99ac-1f81f67a857d") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:52.475496 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.475479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw8rh\" (UniqueName: \"kubernetes.io/projected/c887aba7-380b-4a80-bc2c-6e89b986da6b-kube-api-access-jw8rh\") pod \"network-check-target-858rm\" (UID: \"c887aba7-380b-4a80-bc2c-6e89b986da6b\") " pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:52.733816 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.733786 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf"
Apr 22 19:23:52.733971 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.733817 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88"
Apr 22 19:23:52.733971 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.733822 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:52.736040 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.736017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 19:23:52.736174 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.736071 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqt8b\""
Apr 22 19:23:52.736174 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.736122 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:23:52.736174 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.736128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7c2z6\""
Apr 22 19:23:52.758321 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.758299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-858rm"
Apr 22 19:23:52.897442 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.897388 2576 generic.go:358] "Generic (PLEG): container finished" podID="17ada943-fc5e-4b09-bf82-9132909cb32d" containerID="c439ee74d9afea03661449f660e57f58bf42511f7bb96497f442c613fe0ece24" exitCode=0
Apr 22 19:23:52.897792 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.897482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerDied","Data":"c439ee74d9afea03661449f660e57f58bf42511f7bb96497f442c613fe0ece24"}
Apr 22 19:23:52.899651 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.899622 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-858rm"]
Apr 22 19:23:52.903071 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.903025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-748dj" event={"ID":"97ce33ae-619d-451a-8e53-dc35cdbba9e3","Type":"ContainerStarted","Data":"a4391ffcaccc394df7d639d141e688d234d2a2ceb88b189d79761a3a25ceb251"}
Apr 22 19:23:52.904392 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:23:52.904366 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc887aba7_380b_4a80_bc2c_6e89b986da6b.slice/crio-9f4547ae6c4913ff710717513157f049fcda0527a0c631b8d6afd0c6f9ed8233 WatchSource:0}: Error finding container 9f4547ae6c4913ff710717513157f049fcda0527a0c631b8d6afd0c6f9ed8233: Status 404 returned error can't find the container with id 9f4547ae6c4913ff710717513157f049fcda0527a0c631b8d6afd0c6f9ed8233
Apr 22 19:23:52.904641 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.904616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q" event={"ID":"69e02842-f395-40eb-875a-3561c42e7eef","Type":"ContainerStarted","Data":"49af9b2fbbd7b68a689ccea846503ec472ba8126ac72013dd5a41e176fc4b228"}
Apr 22 19:23:52.908318 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.908240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" event={"ID":"944e0b61-e3fd-4e73-88f5-25f65ad44d11","Type":"ContainerStarted","Data":"7df1530056a2569081d2a29e6c4063f8f827c6894e23d20e36ab05e48a5af15b"}
Apr 22 19:23:52.912358 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.912267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v" event={"ID":"45a00f57-cea9-483d-bda8-358ac125ff3b","Type":"ContainerStarted","Data":"7993132bb7b60bfc5ea1468826dcf48143430745f74bb49db656b2422841cf77"}
Apr 22 19:23:52.914779 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.914756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" event={"ID":"af8ba87e-3b79-4273-af80-a16c607dbf85","Type":"ContainerStarted","Data":"68d13a81bdde542de352f9a1191d8840979ed82cc527a030ddf771cda3ce912f"}
Apr 22 19:23:52.917234 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:52.917188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" event={"ID":"403f71c1-3bb4-4aef-9a26-b062489c9d03","Type":"ContainerStarted","Data":"a2fab8ca93ef89bb00e636edd290c2d3f13881593741583c18bfa6586347fc30"}
Apr 22 19:23:53.280783 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.280029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:53.280783 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.280273 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:53.280783 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.280289 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7674868db4-j8tkf: secret "image-registry-tls" not found
Apr 22 19:23:53.280783 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.280356 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls podName:14676505-81f2-4619-b4da-c41d68fefcce nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.280337143 +0000 UTC m=+37.141290861 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls") pod "image-registry-7674868db4-j8tkf" (UID: "14676505-81f2-4619-b4da-c41d68fefcce") : secret "image-registry-tls" not found
Apr 22 19:23:53.381254 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.381167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:53.381254 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.381250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn"
Apr 22 19:23:53.381474 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.381330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"
Apr 22 19:23:53.381550 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.381523 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:23:53.381602 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.381592 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls podName:3543ced1-0882-444d-be8a-53d6ef47c1e6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.381572575 +0000 UTC m=+37.242526297 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrxdf" (UID: "3543ced1-0882-444d-be8a-53d6ef47c1e6") : secret "samples-operator-tls" not found
Apr 22 19:23:53.382044 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.382022 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:23:53.382169 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.382082 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.382066168 +0000 UTC m=+37.243019889 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : secret "router-metrics-certs-default" not found
Apr 22 19:23:53.382169 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.382156 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.382145895 +0000 UTC m=+37.243099623 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : configmap references non-existent config key: service-ca.crt
Apr 22 19:23:53.482622 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.482584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr"
Apr 22 19:23:53.482831 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.482744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"
Apr 22 19:23:53.482831 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.482790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445"
Apr 22 19:23:53.482957 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.482838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979"
Apr 22 19:23:53.483026 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.482979 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:23:53.483077 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.483041 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls podName:a439b73c-32fc-4331-99ac-1f81f67a857d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.483021633 +0000 UTC m=+37.343975354 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls") pod "dns-default-cj979" (UID: "a439b73c-32fc-4331-99ac-1f81f67a857d") : secret "dns-default-metrics-tls" not found
Apr 22 19:23:53.483447 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.483428 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:23:53.483537 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.483485 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert podName:04ec7b20-5a3e-44b4-afea-44994d100476 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.483467806 +0000 UTC m=+37.344421570 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert") pod "ingress-canary-rbbtr" (UID: "04ec7b20-5a3e-44b4-afea-44994d100476") : secret "canary-serving-cert" not found
Apr 22 19:23:53.483602 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.483543 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:23:53.483793 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.483577 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls podName:3dd0e38f-633c-4e17-8389-15a41d942093 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.483566459 +0000 UTC m=+37.344520177 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5sfmj" (UID: "3dd0e38f-633c-4e17-8389-15a41d942093") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:23:53.483793 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.483671 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:23:53.483793 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:53.483775 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert podName:3a182f49-9fd9-43cf-983a-9e45aae094ac nodeName:}" failed. No retries permitted until 2026-04-22 19:23:55.483762057 +0000 UTC m=+37.344715778 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wg445" (UID: "3a182f49-9fd9-43cf-983a-9e45aae094ac") : secret "networking-console-plugin-cert" not found
Apr 22 19:23:53.924698 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.924344 2576 generic.go:358] "Generic (PLEG): container finished" podID="17ada943-fc5e-4b09-bf82-9132909cb32d" containerID="09aea617e633ba2b4d17ca791eee133cdb01e6499f782b757f6a6272940059bd" exitCode=0
Apr 22 19:23:53.924698 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.924452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerDied","Data":"09aea617e633ba2b4d17ca791eee133cdb01e6499f782b757f6a6272940059bd"}
Apr 22 19:23:53.928076 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:53.928013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-858rm" event={"ID":"c887aba7-380b-4a80-bc2c-6e89b986da6b","Type":"ContainerStarted","Data":"9f4547ae6c4913ff710717513157f049fcda0527a0c631b8d6afd0c6f9ed8233"}
Apr 22 19:23:55.303521 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:55.303376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf"
Apr 22 19:23:55.303905 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.303540 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:23:55.303905 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.303563 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7674868db4-j8tkf: secret "image-registry-tls" not found
Apr 22 19:23:55.303905 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.303629 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls podName:14676505-81f2-4619-b4da-c41d68fefcce nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.303609474 +0000 UTC m=+41.164563221 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls") pod "image-registry-7674868db4-j8tkf" (UID: "14676505-81f2-4619-b4da-c41d68fefcce") : secret "image-registry-tls" not found
Apr 22 19:23:55.404303 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:55.404211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"
Apr 22 19:23:55.404303 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.404291 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:23:55.404523 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.404417 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls podName:3543ced1-0882-444d-be8a-53d6ef47c1e6 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.404387902 +0000 UTC m=+41.265341656 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrxdf" (UID: "3543ced1-0882-444d-be8a-53d6ef47c1e6") : secret "samples-operator-tls" not found Apr 22 19:23:55.404523 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:55.404492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:55.404616 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:55.404573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:55.404741 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.404652 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:23:55.404741 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.404715 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.404696321 +0000 UTC m=+41.265650040 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : secret "router-metrics-certs-default" not found Apr 22 19:23:55.404833 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.404752 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.404741907 +0000 UTC m=+41.265695623 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : configmap references non-existent config key: service-ca.crt Apr 22 19:23:55.505852 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:55.505755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:23:55.505852 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:55.505808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:23:55.506089 ip-10-0-129-175 
kubenswrapper[2576]: I0422 19:23:55.505855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.505895 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.505960 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.505983 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:55.505905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.505969 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls podName:3dd0e38f-633c-4e17-8389-15a41d942093 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.505950968 +0000 UTC m=+41.366904683 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5sfmj" (UID: "3dd0e38f-633c-4e17-8389-15a41d942093") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.506008 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.506058 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert podName:3a182f49-9fd9-43cf-983a-9e45aae094ac nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.506018985 +0000 UTC m=+41.366972714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wg445" (UID: "3a182f49-9fd9-43cf-983a-9e45aae094ac") : secret "networking-console-plugin-cert" not found Apr 22 19:23:55.506089 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.506087 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert podName:04ec7b20-5a3e-44b4-afea-44994d100476 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.506074043 +0000 UTC m=+41.367027759 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert") pod "ingress-canary-rbbtr" (UID: "04ec7b20-5a3e-44b4-afea-44994d100476") : secret "canary-serving-cert" not found Apr 22 19:23:55.506510 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:55.506110 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls podName:a439b73c-32fc-4331-99ac-1f81f67a857d nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.506100403 +0000 UTC m=+41.367054120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls") pod "dns-default-cj979" (UID: "a439b73c-32fc-4331-99ac-1f81f67a857d") : secret "dns-default-metrics-tls" not found Apr 22 19:23:59.342347 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.342153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:23:59.342761 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.342315 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:59.342761 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.342402 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7674868db4-j8tkf: secret "image-registry-tls" not found Apr 22 19:23:59.342761 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.342471 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls podName:14676505-81f2-4619-b4da-c41d68fefcce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.342452507 +0000 UTC m=+49.203406241 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls") pod "image-registry-7674868db4-j8tkf" (UID: "14676505-81f2-4619-b4da-c41d68fefcce") : secret "image-registry-tls" not found Apr 22 19:23:59.443465 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.443438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:59.443569 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.443507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:23:59.443631 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.443587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:23:59.443631 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.443602 2576 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:23:59.443763 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.443677 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.443656351 +0000 UTC m=+49.304610068 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : secret "router-metrics-certs-default" not found Apr 22 19:23:59.443763 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.443703 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:23:59.443763 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.443715 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.443692653 +0000 UTC m=+49.304646373 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : configmap references non-existent config key: service-ca.crt Apr 22 19:23:59.443924 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.443766 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls podName:3543ced1-0882-444d-be8a-53d6ef47c1e6 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:07.443752128 +0000 UTC m=+49.304705856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrxdf" (UID: "3543ced1-0882-444d-be8a-53d6ef47c1e6") : secret "samples-operator-tls" not found Apr 22 19:23:59.546211 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.545668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:23:59.546211 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.545741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:23:59.546211 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.545789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:23:59.546211 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.545834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:23:59.546211 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.545971 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:59.546211 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.546030 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert podName:04ec7b20-5a3e-44b4-afea-44994d100476 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.546009521 +0000 UTC m=+49.406963256 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert") pod "ingress-canary-rbbtr" (UID: "04ec7b20-5a3e-44b4-afea-44994d100476") : secret "canary-serving-cert" not found Apr 22 19:23:59.546590 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.546304 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:23:59.546590 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.546369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert podName:3a182f49-9fd9-43cf-983a-9e45aae094ac nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.546351091 +0000 UTC m=+49.407304821 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wg445" (UID: "3a182f49-9fd9-43cf-983a-9e45aae094ac") : secret "networking-console-plugin-cert" not found Apr 22 19:23:59.546590 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.546436 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:23:59.546590 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.546440 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:59.546590 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.546477 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls podName:3dd0e38f-633c-4e17-8389-15a41d942093 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.546465186 +0000 UTC m=+49.407418904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5sfmj" (UID: "3dd0e38f-633c-4e17-8389-15a41d942093") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:23:59.546590 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:23:59.546497 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls podName:a439b73c-32fc-4331-99ac-1f81f67a857d nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.546487018 +0000 UTC m=+49.407440738 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls") pod "dns-default-cj979" (UID: "a439b73c-32fc-4331-99ac-1f81f67a857d") : secret "dns-default-metrics-tls" not found Apr 22 19:23:59.944272 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.944155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-748dj" event={"ID":"97ce33ae-619d-451a-8e53-dc35cdbba9e3","Type":"ContainerStarted","Data":"cfc799c07791d37a8068ecf7a9e0901ce7ab181e8afff89503464754567dde2f"} Apr 22 19:23:59.945499 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.945472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q" event={"ID":"69e02842-f395-40eb-875a-3561c42e7eef","Type":"ContainerStarted","Data":"d88c561140298bca93344fe29cc438abc4f44cbfe4763ca1fc9feb6aec031e23"} Apr 22 19:23:59.946852 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.946810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" event={"ID":"944e0b61-e3fd-4e73-88f5-25f65ad44d11","Type":"ContainerStarted","Data":"cf8b5b5df2ad368fe749a52cd2105740eba755753941e3bbbf3401ca9602a4a8"} Apr 22 19:23:59.948409 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.948383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v" event={"ID":"45a00f57-cea9-483d-bda8-358ac125ff3b","Type":"ContainerStarted","Data":"7e0ca03047d72f4c37f3ed37a4535b888c9cd6743cecba4d5d42c10c3b876914"} Apr 22 19:23:59.956029 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.956008 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/0.log" Apr 22 19:23:59.956143 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.956042 2576 generic.go:358] "Generic (PLEG): container finished" podID="af8ba87e-3b79-4273-af80-a16c607dbf85" containerID="81c7bf14d39c9724a397a123cb38ccbfeb7d0eea6da420b2f5b95f032fdad5fc" exitCode=255 Apr 22 19:23:59.956143 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.956106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" event={"ID":"af8ba87e-3b79-4273-af80-a16c607dbf85","Type":"ContainerDied","Data":"81c7bf14d39c9724a397a123cb38ccbfeb7d0eea6da420b2f5b95f032fdad5fc"} Apr 22 19:23:59.956345 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.956327 2576 scope.go:117] "RemoveContainer" containerID="81c7bf14d39c9724a397a123cb38ccbfeb7d0eea6da420b2f5b95f032fdad5fc" Apr 22 19:23:59.957637 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.957613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" event={"ID":"403f71c1-3bb4-4aef-9a26-b062489c9d03","Type":"ContainerStarted","Data":"088fcf9d0cd2c8e680380a5c790b04e4b8688b23ac89db1415950593cb7adcb2"} Apr 22 19:23:59.960039 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.959978 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-748dj" podStartSLOduration=11.0213104 podStartE2EDuration="17.959964079s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:52.268493731 +0000 UTC m=+34.129447453" lastFinishedPulling="2026-04-22 19:23:59.207147402 +0000 UTC m=+41.068101132" observedRunningTime="2026-04-22 19:23:59.959033875 +0000 UTC m=+41.819987615" watchObservedRunningTime="2026-04-22 19:23:59.959964079 +0000 UTC m=+41.820917819" Apr 22 
19:23:59.962125 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.962093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" event={"ID":"17ada943-fc5e-4b09-bf82-9132909cb32d","Type":"ContainerStarted","Data":"98a144e81be61b1648c1870e0f1aa22f418a0c75167bb1ad50b255e15da9afc1"} Apr 22 19:23:59.963810 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.963788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-858rm" event={"ID":"c887aba7-380b-4a80-bc2c-6e89b986da6b","Type":"ContainerStarted","Data":"888428c0afbb3b7fdc3424908cebcb1ea02d34e218aed6c46b260729ac0f17e4"} Apr 22 19:23:59.963958 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.963942 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:23:59.972857 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:23:59.972794 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pq94v" podStartSLOduration=10.898962715 podStartE2EDuration="17.97278074s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:52.133356077 +0000 UTC m=+33.994309797" lastFinishedPulling="2026-04-22 19:23:59.207174096 +0000 UTC m=+41.068127822" observedRunningTime="2026-04-22 19:23:59.971957713 +0000 UTC m=+41.832911449" watchObservedRunningTime="2026-04-22 19:23:59.97278074 +0000 UTC m=+41.833734478" Apr 22 19:24:00.006930 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.006879 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" podStartSLOduration=11.069757699 podStartE2EDuration="18.006860524s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:52.270045674 
+0000 UTC m=+34.130999400" lastFinishedPulling="2026-04-22 19:23:59.207148503 +0000 UTC m=+41.068102225" observedRunningTime="2026-04-22 19:24:00.004985542 +0000 UTC m=+41.865939280" watchObservedRunningTime="2026-04-22 19:24:00.006860524 +0000 UTC m=+41.867814262" Apr 22 19:24:00.020910 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.020853 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" podStartSLOduration=10.990175199 podStartE2EDuration="18.020834861s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:52.270403184 +0000 UTC m=+34.131356902" lastFinishedPulling="2026-04-22 19:23:59.301062848 +0000 UTC m=+41.162016564" observedRunningTime="2026-04-22 19:24:00.020079414 +0000 UTC m=+41.881033151" watchObservedRunningTime="2026-04-22 19:24:00.020834861 +0000 UTC m=+41.881788599" Apr 22 19:24:00.043064 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.043001 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v2f7q" podStartSLOduration=11.105219165 podStartE2EDuration="18.04298228s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:52.26965107 +0000 UTC m=+34.130604806" lastFinishedPulling="2026-04-22 19:23:59.207414191 +0000 UTC m=+41.068367921" observedRunningTime="2026-04-22 19:24:00.041438225 +0000 UTC m=+41.902391962" watchObservedRunningTime="2026-04-22 19:24:00.04298228 +0000 UTC m=+41.903936019" Apr 22 19:24:00.070754 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.068760 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hr5vn" podStartSLOduration=10.722320606 podStartE2EDuration="41.068716234s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" 
firstStartedPulling="2026-04-22 19:23:21.409559957 +0000 UTC m=+3.270513691" lastFinishedPulling="2026-04-22 19:23:51.755955588 +0000 UTC m=+33.616909319" observedRunningTime="2026-04-22 19:24:00.068218598 +0000 UTC m=+41.929172338" watchObservedRunningTime="2026-04-22 19:24:00.068716234 +0000 UTC m=+41.929669964" Apr 22 19:24:00.083328 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.083279 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-858rm" podStartSLOduration=35.681343150000004 podStartE2EDuration="42.083263667s" podCreationTimestamp="2026-04-22 19:23:18 +0000 UTC" firstStartedPulling="2026-04-22 19:23:52.909106274 +0000 UTC m=+34.770059992" lastFinishedPulling="2026-04-22 19:23:59.311026793 +0000 UTC m=+41.171980509" observedRunningTime="2026-04-22 19:24:00.082315408 +0000 UTC m=+41.943269147" watchObservedRunningTime="2026-04-22 19:24:00.083263667 +0000 UTC m=+41.944217434" Apr 22 19:24:00.552673 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.552633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:24:00.556774 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.556739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8-original-pull-secret\") pod \"global-pull-secret-syncer-rb7nf\" (UID: \"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8\") " pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:24:00.845781 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.845674 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb7nf" Apr 22 19:24:00.966299 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.966242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rb7nf"] Apr 22 19:24:00.969099 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.969079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:24:00.969572 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.969486 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/0.log" Apr 22 19:24:00.969572 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.969518 2576 generic.go:358] "Generic (PLEG): container finished" podID="af8ba87e-3b79-4273-af80-a16c607dbf85" containerID="8ec1c0f5b0c6ab84a9191944a5bbebe74057ce6ea46aaf7a565dedb1ba9585d3" exitCode=255 Apr 22 19:24:00.969693 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.969635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" event={"ID":"af8ba87e-3b79-4273-af80-a16c607dbf85","Type":"ContainerDied","Data":"8ec1c0f5b0c6ab84a9191944a5bbebe74057ce6ea46aaf7a565dedb1ba9585d3"} Apr 22 19:24:00.969693 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.969671 2576 scope.go:117] "RemoveContainer" containerID="81c7bf14d39c9724a397a123cb38ccbfeb7d0eea6da420b2f5b95f032fdad5fc" Apr 22 19:24:00.969968 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:00.969943 2576 scope.go:117] "RemoveContainer" containerID="8ec1c0f5b0c6ab84a9191944a5bbebe74057ce6ea46aaf7a565dedb1ba9585d3" Apr 22 19:24:00.970419 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:00.970176 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5rw5p_openshift-console-operator(af8ba87e-3b79-4273-af80-a16c607dbf85)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" podUID="af8ba87e-3b79-4273-af80-a16c607dbf85" Apr 22 19:24:00.972300 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:00.972258 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a86c6d_76f8_443f_bc6b_13cdb22ad4b8.slice/crio-b2761b9430095683d96dd1fbd2403334149ddddb4fb65cb2ecf2d46f26a0943a WatchSource:0}: Error finding container b2761b9430095683d96dd1fbd2403334149ddddb4fb65cb2ecf2d46f26a0943a: Status 404 returned error can't find the container with id b2761b9430095683d96dd1fbd2403334149ddddb4fb65cb2ecf2d46f26a0943a Apr 22 19:24:01.797561 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:01.797527 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5hk26_eb4cf245-3080-4779-80ac-295e37a8327f/dns-node-resolver/0.log" Apr 22 19:24:01.906933 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:01.906893 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:24:01.906933 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:01.906934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:24:01.973681 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:01.973651 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:24:01.974150 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:01.974119 2576 scope.go:117] "RemoveContainer" 
containerID="8ec1c0f5b0c6ab84a9191944a5bbebe74057ce6ea46aaf7a565dedb1ba9585d3" Apr 22 19:24:01.974395 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:01.974356 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5rw5p_openshift-console-operator(af8ba87e-3b79-4273-af80-a16c607dbf85)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" podUID="af8ba87e-3b79-4273-af80-a16c607dbf85" Apr 22 19:24:01.974953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:01.974916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rb7nf" event={"ID":"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8","Type":"ContainerStarted","Data":"b2761b9430095683d96dd1fbd2403334149ddddb4fb65cb2ecf2d46f26a0943a"} Apr 22 19:24:02.977872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:02.977846 2576 scope.go:117] "RemoveContainer" containerID="8ec1c0f5b0c6ab84a9191944a5bbebe74057ce6ea46aaf7a565dedb1ba9585d3" Apr 22 19:24:02.978324 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:02.978042 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5rw5p_openshift-console-operator(af8ba87e-3b79-4273-af80-a16c607dbf85)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" podUID="af8ba87e-3b79-4273-af80-a16c607dbf85" Apr 22 19:24:02.997491 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:02.997472 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pnb2n_a2085208-6038-4f2b-929e-b75fe56d5a28/node-ca/0.log" Apr 22 19:24:05.987672 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:05.987639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-rb7nf" event={"ID":"b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8","Type":"ContainerStarted","Data":"642fd6a82c5a1afd5d320eaac3daf93c6b3a42fdfefca790a7672060403934ae"} Apr 22 19:24:06.003268 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:06.003226 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rb7nf" podStartSLOduration=33.496084546 podStartE2EDuration="38.003214645s" podCreationTimestamp="2026-04-22 19:23:28 +0000 UTC" firstStartedPulling="2026-04-22 19:24:00.973937541 +0000 UTC m=+42.834891260" lastFinishedPulling="2026-04-22 19:24:05.481067641 +0000 UTC m=+47.342021359" observedRunningTime="2026-04-22 19:24:06.002192316 +0000 UTC m=+47.863146053" watchObservedRunningTime="2026-04-22 19:24:06.003214645 +0000 UTC m=+47.864168382" Apr 22 19:24:07.407536 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.407498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:24:07.407952 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.407643 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:07.407952 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.407661 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7674868db4-j8tkf: secret "image-registry-tls" not found Apr 22 19:24:07.407952 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.407717 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls 
podName:14676505-81f2-4619-b4da-c41d68fefcce nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.407701488 +0000 UTC m=+65.268655202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls") pod "image-registry-7674868db4-j8tkf" (UID: "14676505-81f2-4619-b4da-c41d68fefcce") : secret "image-registry-tls" not found Apr 22 19:24:07.508848 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.508808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:07.509011 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.508863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:07.509011 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.508922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:24:07.509011 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.508954 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 
19:24:07.509111 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.509019 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.509004658 +0000 UTC m=+65.369958373 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : secret "router-metrics-certs-default" not found Apr 22 19:24:07.509111 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.509027 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:24:07.509111 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.509058 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle podName:117697b4-88b4-4152-942c-224dcf13a685 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.50904277 +0000 UTC m=+65.369996497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle") pod "router-default-7697566b5d-hh5dn" (UID: "117697b4-88b4-4152-942c-224dcf13a685") : configmap references non-existent config key: service-ca.crt Apr 22 19:24:07.509111 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.509077 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls podName:3543ced1-0882-444d-be8a-53d6ef47c1e6 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.509070047 +0000 UTC m=+65.370023774 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wrxdf" (UID: "3543ced1-0882-444d-be8a-53d6ef47c1e6") : secret "samples-operator-tls" not found Apr 22 19:24:07.610067 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.610028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:24:07.610067 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.610072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:24:07.610267 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610160 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:24:07.610267 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610161 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:24:07.610267 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.610197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod 
\"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:24:07.610267 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610208 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert podName:3a182f49-9fd9-43cf-983a-9e45aae094ac nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.61019438 +0000 UTC m=+65.471148095 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wg445" (UID: "3a182f49-9fd9-43cf-983a-9e45aae094ac") : secret "networking-console-plugin-cert" not found Apr 22 19:24:07.610267 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610267 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:07.610426 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610268 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls podName:3dd0e38f-633c-4e17-8389-15a41d942093 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.610251648 +0000 UTC m=+65.471205367 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5sfmj" (UID: "3dd0e38f-633c-4e17-8389-15a41d942093") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:24:07.610426 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610312 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls podName:a439b73c-32fc-4331-99ac-1f81f67a857d nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.610301899 +0000 UTC m=+65.471255614 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls") pod "dns-default-cj979" (UID: "a439b73c-32fc-4331-99ac-1f81f67a857d") : secret "dns-default-metrics-tls" not found Apr 22 19:24:07.610426 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:07.610358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:24:07.610535 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610448 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:07.610535 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:24:07.610474 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert podName:04ec7b20-5a3e-44b4-afea-44994d100476 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.610467107 +0000 UTC m=+65.471420822 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert") pod "ingress-canary-rbbtr" (UID: "04ec7b20-5a3e-44b4-afea-44994d100476") : secret "canary-serving-cert" not found Apr 22 19:24:16.890708 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:16.890679 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2gjj" Apr 22 19:24:17.733595 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:17.733566 2576 scope.go:117] "RemoveContainer" containerID="8ec1c0f5b0c6ab84a9191944a5bbebe74057ce6ea46aaf7a565dedb1ba9585d3" Apr 22 19:24:18.022927 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:18.022848 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:24:18.022927 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:18.022902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" event={"ID":"af8ba87e-3b79-4273-af80-a16c607dbf85","Type":"ContainerStarted","Data":"ed898a269db8480ffa579376d5b2777b2bf8a95f8ae50ef961d42d5ccf015c83"} Apr 22 19:24:18.023392 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:18.023131 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:24:18.042692 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:18.042633 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" podStartSLOduration=28.968359702 podStartE2EDuration="36.042616901s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:23:52.13304115 +0000 UTC m=+33.993994869" lastFinishedPulling="2026-04-22 19:23:59.207298351 +0000 UTC m=+41.068252068" 
observedRunningTime="2026-04-22 19:24:18.042017042 +0000 UTC m=+59.902970813" watchObservedRunningTime="2026-04-22 19:24:18.042616901 +0000 UTC m=+59.903570639" Apr 22 19:24:18.577872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:18.577840 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-5rw5p" Apr 22 19:24:23.442314 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.442273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:24:23.444838 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.444806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"image-registry-7674868db4-j8tkf\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:24:23.542708 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.542671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:23.542882 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.542723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: 
\"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:23.542882 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.542794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:24:23.543459 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.543433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117697b4-88b4-4152-942c-224dcf13a685-service-ca-bundle\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:23.545432 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.545408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3543ced1-0882-444d-be8a-53d6ef47c1e6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wrxdf\" (UID: \"3543ced1-0882-444d-be8a-53d6ef47c1e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:24:23.545542 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.545519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/117697b4-88b4-4152-942c-224dcf13a685-metrics-certs\") pod \"router-default-7697566b5d-hh5dn\" (UID: \"117697b4-88b4-4152-942c-224dcf13a685\") " pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:23.643824 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.643791 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:24:23.643984 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.643870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:24:23.644047 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.644024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:24:23.644098 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.644070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 19:24:23.646188 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.646164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a439b73c-32fc-4331-99ac-1f81f67a857d-metrics-tls\") pod \"dns-default-cj979\" (UID: \"a439b73c-32fc-4331-99ac-1f81f67a857d\") " pod="openshift-dns/dns-default-cj979" Apr 22 
19:24:23.646384 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.646365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3dd0e38f-633c-4e17-8389-15a41d942093-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5sfmj\" (UID: \"3dd0e38f-633c-4e17-8389-15a41d942093\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:24:23.646384 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.646376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ec7b20-5a3e-44b4-afea-44994d100476-cert\") pod \"ingress-canary-rbbtr\" (UID: \"04ec7b20-5a3e-44b4-afea-44994d100476\") " pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:24:23.646470 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.646431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3a182f49-9fd9-43cf-983a-9e45aae094ac-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wg445\" (UID: \"3a182f49-9fd9-43cf-983a-9e45aae094ac\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:24:23.680143 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.680114 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-df49v\"" Apr 22 19:24:23.688560 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.688532 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:24:23.729999 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.729973 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hv58p\"" Apr 22 19:24:23.738836 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.738798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" Apr 22 19:24:23.783503 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.783478 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-4sm6v\"" Apr 22 19:24:23.791945 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.791914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:23.812454 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.812400 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7674868db4-j8tkf"] Apr 22 19:24:23.818081 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:23.818049 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14676505_81f2_4619_b4da_c41d68fefcce.slice/crio-f36a2bebd526e34c20fb02bf1a4eedcf9b73256be1acccdea371062aed58c8fe WatchSource:0}: Error finding container f36a2bebd526e34c20fb02bf1a4eedcf9b73256be1acccdea371062aed58c8fe: Status 404 returned error can't find the container with id f36a2bebd526e34c20fb02bf1a4eedcf9b73256be1acccdea371062aed58c8fe Apr 22 19:24:23.822359 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.822314 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-kb7vh\"" Apr 22 
19:24:23.830947 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.830917 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" Apr 22 19:24:23.834656 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.834391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-f2r5m\"" Apr 22 19:24:23.843546 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.843520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" Apr 22 19:24:23.866336 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.866301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wknft\"" Apr 22 19:24:23.875318 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.875204 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbbtr" Apr 22 19:24:23.882654 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.881338 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf"] Apr 22 19:24:23.897829 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.897720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bspvd\"" Apr 22 19:24:23.906896 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.906358 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cj979" Apr 22 19:24:23.954276 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:23.954244 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7697566b5d-hh5dn"] Apr 22 19:24:23.958925 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:23.958878 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod117697b4_88b4_4152_942c_224dcf13a685.slice/crio-8d5a7a91ef315a56bb7b4cbdab8952afc5fd1332803378a2b2b1482a2deddad5 WatchSource:0}: Error finding container 8d5a7a91ef315a56bb7b4cbdab8952afc5fd1332803378a2b2b1482a2deddad5: Status 404 returned error can't find the container with id 8d5a7a91ef315a56bb7b4cbdab8952afc5fd1332803378a2b2b1482a2deddad5 Apr 22 19:24:24.017984 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.017936 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj"] Apr 22 19:24:24.024299 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:24.024097 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd0e38f_633c_4e17_8389_15a41d942093.slice/crio-c7415b86ff5f399a29e67144b862054449e4e803e50617636f3090f7eb04fa06 WatchSource:0}: Error finding container c7415b86ff5f399a29e67144b862054449e4e803e50617636f3090f7eb04fa06: Status 404 returned error can't find the container with id c7415b86ff5f399a29e67144b862054449e4e803e50617636f3090f7eb04fa06 Apr 22 19:24:24.031813 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.031781 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wg445"] Apr 22 19:24:24.039965 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.039927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" event={"ID":"3543ced1-0882-444d-be8a-53d6ef47c1e6","Type":"ContainerStarted","Data":"ce16e7e2f3b5b6a34a46505bdf2ad3f5d03e10ae32b9d2c824d7b8b1d60ebc49"} Apr 22 19:24:24.041507 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.041474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" event={"ID":"3dd0e38f-633c-4e17-8389-15a41d942093","Type":"ContainerStarted","Data":"c7415b86ff5f399a29e67144b862054449e4e803e50617636f3090f7eb04fa06"} Apr 22 19:24:24.042093 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:24.042015 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a182f49_9fd9_43cf_983a_9e45aae094ac.slice/crio-f61300743fa1ee6da4791ac15b0d1e10852550f6965ab93cfc048b751c7e1370 WatchSource:0}: Error finding container f61300743fa1ee6da4791ac15b0d1e10852550f6965ab93cfc048b751c7e1370: Status 404 returned error can't find the container with id f61300743fa1ee6da4791ac15b0d1e10852550f6965ab93cfc048b751c7e1370 Apr 22 19:24:24.043578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.043498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" event={"ID":"14676505-81f2-4619-b4da-c41d68fefcce","Type":"ContainerStarted","Data":"2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5"} Apr 22 19:24:24.043578 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.043531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" event={"ID":"14676505-81f2-4619-b4da-c41d68fefcce","Type":"ContainerStarted","Data":"f36a2bebd526e34c20fb02bf1a4eedcf9b73256be1acccdea371062aed58c8fe"} Apr 22 19:24:24.043694 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.043641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:24:24.045397 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.045334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7697566b5d-hh5dn" event={"ID":"117697b4-88b4-4152-942c-224dcf13a685","Type":"ContainerStarted","Data":"8d5a7a91ef315a56bb7b4cbdab8952afc5fd1332803378a2b2b1482a2deddad5"} Apr 22 19:24:24.082620 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.082478 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" podStartSLOduration=65.082445552 podStartE2EDuration="1m5.082445552s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:24.065432133 +0000 UTC m=+65.926385874" watchObservedRunningTime="2026-04-22 19:24:24.082445552 +0000 UTC m=+65.943399291" Apr 22 19:24:24.085976 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.084569 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbbtr"] Apr 22 19:24:24.097493 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:24.097459 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04ec7b20_5a3e_44b4_afea_44994d100476.slice/crio-6bdee510b2c17b70badd4759a98472c6c7c484aaf6b419c804923605f6a784af WatchSource:0}: Error finding container 6bdee510b2c17b70badd4759a98472c6c7c484aaf6b419c804923605f6a784af: Status 404 returned error can't find the container with id 6bdee510b2c17b70badd4759a98472c6c7c484aaf6b419c804923605f6a784af Apr 22 19:24:24.110429 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.110399 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cj979"] Apr 22 19:24:24.117575 ip-10-0-129-175 
kubenswrapper[2576]: W0422 19:24:24.117508 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda439b73c_32fc_4331_99ac_1f81f67a857d.slice/crio-166e4fd53bba836f50bc7ac396032e73a9ae1bf7be7c2d3f8cf76fdc74b44206 WatchSource:0}: Error finding container 166e4fd53bba836f50bc7ac396032e73a9ae1bf7be7c2d3f8cf76fdc74b44206: Status 404 returned error can't find the container with id 166e4fd53bba836f50bc7ac396032e73a9ae1bf7be7c2d3f8cf76fdc74b44206 Apr 22 19:24:24.451360 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.451324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:24:24.453387 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.453367 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:24:24.464473 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.464443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac86a56e-148b-4fb4-8415-acff438d7915-metrics-certs\") pod \"network-metrics-daemon-4sl88\" (UID: \"ac86a56e-148b-4fb4-8415-acff438d7915\") " pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:24:24.555039 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.555007 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tqt8b\"" Apr 22 19:24:24.563506 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.563480 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4sl88" Apr 22 19:24:24.707909 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:24.707813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4sl88"] Apr 22 19:24:24.713010 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:24.712974 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac86a56e_148b_4fb4_8415_acff438d7915.slice/crio-c0055242df2a2066952852291c250849d7b5cc666b326d076cf5f2129688e0c4 WatchSource:0}: Error finding container c0055242df2a2066952852291c250849d7b5cc666b326d076cf5f2129688e0c4: Status 404 returned error can't find the container with id c0055242df2a2066952852291c250849d7b5cc666b326d076cf5f2129688e0c4 Apr 22 19:24:25.052776 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.052638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cj979" event={"ID":"a439b73c-32fc-4331-99ac-1f81f67a857d","Type":"ContainerStarted","Data":"166e4fd53bba836f50bc7ac396032e73a9ae1bf7be7c2d3f8cf76fdc74b44206"} Apr 22 19:24:25.055068 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.055003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" event={"ID":"3a182f49-9fd9-43cf-983a-9e45aae094ac","Type":"ContainerStarted","Data":"f61300743fa1ee6da4791ac15b0d1e10852550f6965ab93cfc048b751c7e1370"} Apr 22 19:24:25.058898 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.058259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7697566b5d-hh5dn" event={"ID":"117697b4-88b4-4152-942c-224dcf13a685","Type":"ContainerStarted","Data":"7426c21a87337a21b3cc74756db4fb306f90a8fab7938298218783ed5bdf198e"} Apr 22 19:24:25.059994 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.059952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-4sl88" event={"ID":"ac86a56e-148b-4fb4-8415-acff438d7915","Type":"ContainerStarted","Data":"c0055242df2a2066952852291c250849d7b5cc666b326d076cf5f2129688e0c4"} Apr 22 19:24:25.062281 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.062237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbbtr" event={"ID":"04ec7b20-5a3e-44b4-afea-44994d100476","Type":"ContainerStarted","Data":"6bdee510b2c17b70badd4759a98472c6c7c484aaf6b419c804923605f6a784af"} Apr 22 19:24:25.079111 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.078187 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7697566b5d-hh5dn" podStartSLOduration=43.07817157 podStartE2EDuration="43.07817157s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:25.077385753 +0000 UTC m=+66.938339493" watchObservedRunningTime="2026-04-22 19:24:25.07817157 +0000 UTC m=+66.939125309" Apr 22 19:24:25.412945 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.412863 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-h762s"] Apr 22 19:24:25.450771 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.450682 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h762s"] Apr 22 19:24:25.450942 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.450873 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.453677 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.453653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pwghq\"" Apr 22 19:24:25.453677 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.453674 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:24:25.454274 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.453680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:24:25.561647 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.561601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/96abbc7c-9222-4307-b3ba-c6d559756d44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.561847 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.561672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/96abbc7c-9222-4307-b3ba-c6d559756d44-data-volume\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.561847 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.561796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96abbc7c-9222-4307-b3ba-c6d559756d44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h762s\" 
(UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.561954 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.561871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4dw\" (UniqueName: \"kubernetes.io/projected/96abbc7c-9222-4307-b3ba-c6d559756d44-kube-api-access-kf4dw\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.561954 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.561940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/96abbc7c-9222-4307-b3ba-c6d559756d44-crio-socket\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.662746 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.662697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/96abbc7c-9222-4307-b3ba-c6d559756d44-data-volume\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.662933 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.662778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96abbc7c-9222-4307-b3ba-c6d559756d44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.662933 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.662833 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kf4dw\" (UniqueName: \"kubernetes.io/projected/96abbc7c-9222-4307-b3ba-c6d559756d44-kube-api-access-kf4dw\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.662933 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.662865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/96abbc7c-9222-4307-b3ba-c6d559756d44-crio-socket\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.662933 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.662902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/96abbc7c-9222-4307-b3ba-c6d559756d44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.663186 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.663043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/96abbc7c-9222-4307-b3ba-c6d559756d44-crio-socket\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.663313 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.663203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/96abbc7c-9222-4307-b3ba-c6d559756d44-data-volume\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" 
Apr 22 19:24:25.663605 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.663559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/96abbc7c-9222-4307-b3ba-c6d559756d44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.665598 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.665554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96abbc7c-9222-4307-b3ba-c6d559756d44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.671137 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.671098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4dw\" (UniqueName: \"kubernetes.io/projected/96abbc7c-9222-4307-b3ba-c6d559756d44-kube-api-access-kf4dw\") pod \"insights-runtime-extractor-h762s\" (UID: \"96abbc7c-9222-4307-b3ba-c6d559756d44\") " pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.767422 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.767381 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h762s" Apr 22 19:24:25.792901 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.792846 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:25.795789 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:25.795764 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:26.065409 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:26.065376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:26.066691 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:26.066666 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7697566b5d-hh5dn" Apr 22 19:24:28.898847 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:28.898821 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h762s"] Apr 22 19:24:28.903514 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:28.903485 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96abbc7c_9222_4307_b3ba_c6d559756d44.slice/crio-e88fe65302e21fe56670aa4da546cd11111a6aa75cc715bdc66afa9d504902b3 WatchSource:0}: Error finding container e88fe65302e21fe56670aa4da546cd11111a6aa75cc715bdc66afa9d504902b3: Status 404 returned error can't find the container with id e88fe65302e21fe56670aa4da546cd11111a6aa75cc715bdc66afa9d504902b3 Apr 22 19:24:29.075157 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.075117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" 
event={"ID":"3543ced1-0882-444d-be8a-53d6ef47c1e6","Type":"ContainerStarted","Data":"771b6443f4dc12d8222d724d4800ffeb8ff8c5e11feba327901da8111a02829a"} Apr 22 19:24:29.076664 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.076634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" event={"ID":"3dd0e38f-633c-4e17-8389-15a41d942093","Type":"ContainerStarted","Data":"ccaf14dced7311db46b2dfb78a17d7d4aaf5d1fc898eba7886da5edb420c8fcf"} Apr 22 19:24:29.078396 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.078367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cj979" event={"ID":"a439b73c-32fc-4331-99ac-1f81f67a857d","Type":"ContainerStarted","Data":"1fe993b155f939f1c78c0bc120b69d3f031a98eb27316d2e8a2b091bce6dbc03"} Apr 22 19:24:29.080397 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.080173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" event={"ID":"3a182f49-9fd9-43cf-983a-9e45aae094ac","Type":"ContainerStarted","Data":"c75cba2478940b0f5506dfc595b659f9f612f895d0b3262b5dac514584a58e2b"} Apr 22 19:24:29.081411 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.081387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h762s" event={"ID":"96abbc7c-9222-4307-b3ba-c6d559756d44","Type":"ContainerStarted","Data":"e88fe65302e21fe56670aa4da546cd11111a6aa75cc715bdc66afa9d504902b3"} Apr 22 19:24:29.082667 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.082646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbbtr" event={"ID":"04ec7b20-5a3e-44b4-afea-44994d100476","Type":"ContainerStarted","Data":"644a6c0019f46b855d963d5ac69e08f56f6aefb8778ca892777edbe083ac480b"} Apr 22 19:24:29.094222 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.094173 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5sfmj" podStartSLOduration=42.364299104 podStartE2EDuration="47.094159492s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:24:24.033320473 +0000 UTC m=+65.894274188" lastFinishedPulling="2026-04-22 19:24:28.76318085 +0000 UTC m=+70.624134576" observedRunningTime="2026-04-22 19:24:29.092675883 +0000 UTC m=+70.953629621" watchObservedRunningTime="2026-04-22 19:24:29.094159492 +0000 UTC m=+70.955113228" Apr 22 19:24:29.108218 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.108168 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wg445" podStartSLOduration=40.390328315 podStartE2EDuration="45.108154779s" podCreationTimestamp="2026-04-22 19:23:44 +0000 UTC" firstStartedPulling="2026-04-22 19:24:24.044146517 +0000 UTC m=+65.905100247" lastFinishedPulling="2026-04-22 19:24:28.761972991 +0000 UTC m=+70.622926711" observedRunningTime="2026-04-22 19:24:29.107030377 +0000 UTC m=+70.967984115" watchObservedRunningTime="2026-04-22 19:24:29.108154779 +0000 UTC m=+70.969108569" Apr 22 19:24:29.125610 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:29.125559 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rbbtr" podStartSLOduration=33.463261735 podStartE2EDuration="38.125544736s" podCreationTimestamp="2026-04-22 19:23:51 +0000 UTC" firstStartedPulling="2026-04-22 19:24:24.099668273 +0000 UTC m=+65.960621996" lastFinishedPulling="2026-04-22 19:24:28.761951276 +0000 UTC m=+70.622904997" observedRunningTime="2026-04-22 19:24:29.123616276 +0000 UTC m=+70.984570008" watchObservedRunningTime="2026-04-22 19:24:29.125544736 +0000 UTC m=+70.986498530" Apr 22 19:24:30.090606 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.090568 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" event={"ID":"3543ced1-0882-444d-be8a-53d6ef47c1e6","Type":"ContainerStarted","Data":"18b7f27763215e9658fbb61c317b3e7631658a5d992fc913061aa95cfbe86e1e"} Apr 22 19:24:30.092243 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.092208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cj979" event={"ID":"a439b73c-32fc-4331-99ac-1f81f67a857d","Type":"ContainerStarted","Data":"24e3f80dfe5d2780da69535330155e18dc4bd8bf4688de1b134ec01db3093660"} Apr 22 19:24:30.092386 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.092310 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cj979" Apr 22 19:24:30.093600 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.093578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h762s" event={"ID":"96abbc7c-9222-4307-b3ba-c6d559756d44","Type":"ContainerStarted","Data":"7890ba15b1f588473f5406dfd1837930cc99739cc32ca1311e8f52d5f689e06d"} Apr 22 19:24:30.095186 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.095159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4sl88" event={"ID":"ac86a56e-148b-4fb4-8415-acff438d7915","Type":"ContainerStarted","Data":"4221a6d995d1b77f1f1fc4b65f7f245c50ba212b7a2a3fc51f03fae9fd00e28d"} Apr 22 19:24:30.095297 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.095193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4sl88" event={"ID":"ac86a56e-148b-4fb4-8415-acff438d7915","Type":"ContainerStarted","Data":"de1f878ed8ef4583cb268040536f0a002c78a49b00b578968b8e9434a23ea58b"} Apr 22 19:24:30.113857 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.113812 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wrxdf" podStartSLOduration=43.321846362 podStartE2EDuration="48.113795518s" podCreationTimestamp="2026-04-22 19:23:42 +0000 UTC" firstStartedPulling="2026-04-22 19:24:23.969852837 +0000 UTC m=+65.830806568" lastFinishedPulling="2026-04-22 19:24:28.761802009 +0000 UTC m=+70.622755724" observedRunningTime="2026-04-22 19:24:30.112822067 +0000 UTC m=+71.973775804" watchObservedRunningTime="2026-04-22 19:24:30.113795518 +0000 UTC m=+71.974749255" Apr 22 19:24:30.135901 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.135847 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4sl88" podStartSLOduration=66.690377075 podStartE2EDuration="1m11.13583202s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="2026-04-22 19:24:24.717460454 +0000 UTC m=+66.578414174" lastFinishedPulling="2026-04-22 19:24:29.16291539 +0000 UTC m=+71.023869119" observedRunningTime="2026-04-22 19:24:30.135114667 +0000 UTC m=+71.996068455" watchObservedRunningTime="2026-04-22 19:24:30.13583202 +0000 UTC m=+71.996785755" Apr 22 19:24:30.151239 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.151175 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cj979" podStartSLOduration=34.508765303 podStartE2EDuration="39.151156041s" podCreationTimestamp="2026-04-22 19:23:51 +0000 UTC" firstStartedPulling="2026-04-22 19:24:24.119532416 +0000 UTC m=+65.980486146" lastFinishedPulling="2026-04-22 19:24:28.761923157 +0000 UTC m=+70.622876884" observedRunningTime="2026-04-22 19:24:30.150742937 +0000 UTC m=+72.011696672" watchObservedRunningTime="2026-04-22 19:24:30.151156041 +0000 UTC m=+72.012109785" Apr 22 19:24:30.973272 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:30.973243 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-858rm" Apr 22 19:24:31.100029 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:31.099993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h762s" event={"ID":"96abbc7c-9222-4307-b3ba-c6d559756d44","Type":"ContainerStarted","Data":"ac9ec7d8bdbe6dda5307089f07e7a717ca350e89da629d866d1e6866eaff51fe"} Apr 22 19:24:32.110424 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:32.110393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h762s" event={"ID":"96abbc7c-9222-4307-b3ba-c6d559756d44","Type":"ContainerStarted","Data":"76fa02ddc1594e94b664bd1333b91243d8b69d01a05d337e8a859c131493f6ee"} Apr 22 19:24:32.130255 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:32.130204 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-h762s" podStartSLOduration=4.444994378 podStartE2EDuration="7.13019149s" podCreationTimestamp="2026-04-22 19:24:25 +0000 UTC" firstStartedPulling="2026-04-22 19:24:29.267900513 +0000 UTC m=+71.128854233" lastFinishedPulling="2026-04-22 19:24:31.953097626 +0000 UTC m=+73.814051345" observedRunningTime="2026-04-22 19:24:32.128394687 +0000 UTC m=+73.989348423" watchObservedRunningTime="2026-04-22 19:24:32.13019149 +0000 UTC m=+73.991145227" Apr 22 19:24:38.931980 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:38.931869 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jc52p"] Apr 22 19:24:38.936254 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:38.936236 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:38.938415 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:38.938394 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:24:38.938518 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:38.938467 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kkgwm\"" Apr 22 19:24:38.938518 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:38.938502 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:24:38.938622 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:38.938548 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:24:38.938798 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:38.938782 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:24:39.071478 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-textfile\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071478 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-accelerators-collector-config\") pod \"node-exporter-jc52p\" (UID: 
\"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071681 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-wtmp\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071681 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/789c291d-695f-4244-bda8-52bd3b3ab880-metrics-client-ca\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071681 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071681 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-tls\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071839 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"root\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-root\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071839 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9twg\" (UniqueName: \"kubernetes.io/projected/789c291d-695f-4244-bda8-52bd3b3ab880-kube-api-access-c9twg\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.071839 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.071813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-sys\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173078 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-textfile\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173078 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-accelerators-collector-config\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173316 ip-10-0-129-175 kubenswrapper[2576]: 
I0422 19:24:39.173115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-wtmp\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173316 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/789c291d-695f-4244-bda8-52bd3b3ab880-metrics-client-ca\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173316 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173316 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-tls\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173316 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-root\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 
19:24:39.173316 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-wtmp\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173599 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-root\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173599 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9twg\" (UniqueName: \"kubernetes.io/projected/789c291d-695f-4244-bda8-52bd3b3ab880-kube-api-access-c9twg\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173599 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-sys\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173599 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-textfile\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173814 
ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/789c291d-695f-4244-bda8-52bd3b3ab880-sys\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173814 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/789c291d-695f-4244-bda8-52bd3b3ab880-metrics-client-ca\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.173903 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.173815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-accelerators-collector-config\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.175568 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.175548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.175712 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.175696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/789c291d-695f-4244-bda8-52bd3b3ab880-node-exporter-tls\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " 
pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.180869 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.180848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9twg\" (UniqueName: \"kubernetes.io/projected/789c291d-695f-4244-bda8-52bd3b3ab880-kube-api-access-c9twg\") pod \"node-exporter-jc52p\" (UID: \"789c291d-695f-4244-bda8-52bd3b3ab880\") " pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.245692 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:39.245662 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jc52p" Apr 22 19:24:39.255787 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:24:39.255758 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod789c291d_695f_4244_bda8_52bd3b3ab880.slice/crio-7a590992286edd9d5fa306bc59b4140d5f7bfaf6289cc87164a9c64fdc391ced WatchSource:0}: Error finding container 7a590992286edd9d5fa306bc59b4140d5f7bfaf6289cc87164a9c64fdc391ced: Status 404 returned error can't find the container with id 7a590992286edd9d5fa306bc59b4140d5f7bfaf6289cc87164a9c64fdc391ced Apr 22 19:24:40.102457 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:40.102431 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cj979" Apr 22 19:24:40.132808 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:40.132768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc52p" event={"ID":"789c291d-695f-4244-bda8-52bd3b3ab880","Type":"ContainerStarted","Data":"7a590992286edd9d5fa306bc59b4140d5f7bfaf6289cc87164a9c64fdc391ced"} Apr 22 19:24:41.136950 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:41.136915 2576 generic.go:358] "Generic (PLEG): container finished" podID="789c291d-695f-4244-bda8-52bd3b3ab880" 
containerID="5bbd8c65ae21936a1de7b844baab852983324ec66a73588bddcf31fcf96a49ca" exitCode=0 Apr 22 19:24:41.137339 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:41.136958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc52p" event={"ID":"789c291d-695f-4244-bda8-52bd3b3ab880","Type":"ContainerDied","Data":"5bbd8c65ae21936a1de7b844baab852983324ec66a73588bddcf31fcf96a49ca"} Apr 22 19:24:42.142034 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:42.141996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc52p" event={"ID":"789c291d-695f-4244-bda8-52bd3b3ab880","Type":"ContainerStarted","Data":"63a463528eadead4f1a13c8a3accc5fe44a8afb3c4a7321ee0d813bb8e7d3f3b"} Apr 22 19:24:42.142034 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:42.142037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jc52p" event={"ID":"789c291d-695f-4244-bda8-52bd3b3ab880","Type":"ContainerStarted","Data":"676e56c8a940b9babde45063fb5b5630675293940080a1675ed6d979eadc8dbd"} Apr 22 19:24:42.158897 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:42.158850 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jc52p" podStartSLOduration=3.343202672 podStartE2EDuration="4.158836744s" podCreationTimestamp="2026-04-22 19:24:38 +0000 UTC" firstStartedPulling="2026-04-22 19:24:39.257849391 +0000 UTC m=+81.118803105" lastFinishedPulling="2026-04-22 19:24:40.07348345 +0000 UTC m=+81.934437177" observedRunningTime="2026-04-22 19:24:42.157848875 +0000 UTC m=+84.018802611" watchObservedRunningTime="2026-04-22 19:24:42.158836744 +0000 UTC m=+84.019790480" Apr 22 19:24:45.067029 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:24:45.067001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:24:47.728476 ip-10-0-129-175 
kubenswrapper[2576]: I0422 19:24:47.728442 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7674868db4-j8tkf"] Apr 22 19:25:12.747412 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:12.747347 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" podUID="14676505-81f2-4619-b4da-c41d68fefcce" containerName="registry" containerID="cri-o://2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5" gracePeriod=30 Apr 22 19:25:12.984737 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:12.984698 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:25:13.051278 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051197 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrl5c\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-kube-api-access-qrl5c\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051278 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051248 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-image-registry-private-configuration\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051278 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051271 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-installation-pull-secrets\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051521 
ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051308 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-trusted-ca\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051521 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051324 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-bound-sa-token\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051521 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051350 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051521 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051372 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-registry-certificates\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051521 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051408 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14676505-81f2-4619-b4da-c41d68fefcce-ca-trust-extracted\") pod \"14676505-81f2-4619-b4da-c41d68fefcce\" (UID: \"14676505-81f2-4619-b4da-c41d68fefcce\") " Apr 22 19:25:13.051905 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051870 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:13.052016 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.051981 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:13.053919 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.053865 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:25:13.054012 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.053920 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:25:13.054012 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.053951 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-kube-api-access-qrl5c" (OuterVolumeSpecName: "kube-api-access-qrl5c") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "kube-api-access-qrl5c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:25:13.054100 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.054081 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:25:13.054273 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.054252 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:25:13.060227 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.060197 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14676505-81f2-4619-b4da-c41d68fefcce-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "14676505-81f2-4619-b4da-c41d68fefcce" (UID: "14676505-81f2-4619-b4da-c41d68fefcce"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:25:13.152067 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152032 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-registry-tls\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.152067 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152059 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-registry-certificates\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.152067 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152073 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14676505-81f2-4619-b4da-c41d68fefcce-ca-trust-extracted\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.152303 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152083 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrl5c\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-kube-api-access-qrl5c\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.152303 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152092 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-image-registry-private-configuration\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.152303 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152103 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14676505-81f2-4619-b4da-c41d68fefcce-installation-pull-secrets\") on node 
\"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.152303 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152114 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14676505-81f2-4619-b4da-c41d68fefcce-trusted-ca\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.152303 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.152123 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14676505-81f2-4619-b4da-c41d68fefcce-bound-sa-token\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:25:13.228895 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.228856 2576 generic.go:358] "Generic (PLEG): container finished" podID="14676505-81f2-4619-b4da-c41d68fefcce" containerID="2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5" exitCode=0 Apr 22 19:25:13.229054 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.228922 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" Apr 22 19:25:13.229054 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.228938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" event={"ID":"14676505-81f2-4619-b4da-c41d68fefcce","Type":"ContainerDied","Data":"2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5"} Apr 22 19:25:13.229054 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.228973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7674868db4-j8tkf" event={"ID":"14676505-81f2-4619-b4da-c41d68fefcce","Type":"ContainerDied","Data":"f36a2bebd526e34c20fb02bf1a4eedcf9b73256be1acccdea371062aed58c8fe"} Apr 22 19:25:13.229054 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.228988 2576 scope.go:117] "RemoveContainer" containerID="2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5" Apr 22 19:25:13.237467 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.237448 2576 scope.go:117] "RemoveContainer" containerID="2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5" Apr 22 19:25:13.237702 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:25:13.237685 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5\": container with ID starting with 2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5 not found: ID does not exist" containerID="2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5" Apr 22 19:25:13.237768 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.237710 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5"} err="failed to get container status 
\"2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5\": rpc error: code = NotFound desc = could not find container \"2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5\": container with ID starting with 2baba7c44e8ee27495a8b91f13ccbfbc72e336f8c371bac83e201f3730906fc5 not found: ID does not exist" Apr 22 19:25:13.249072 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.249049 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7674868db4-j8tkf"] Apr 22 19:25:13.254839 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:13.254816 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7674868db4-j8tkf"] Apr 22 19:25:14.737390 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:14.737355 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14676505-81f2-4619-b4da-c41d68fefcce" path="/var/lib/kubelet/pods/14676505-81f2-4619-b4da-c41d68fefcce/volumes" Apr 22 19:25:15.236870 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:15.236838 2576 generic.go:358] "Generic (PLEG): container finished" podID="403f71c1-3bb4-4aef-9a26-b062489c9d03" containerID="088fcf9d0cd2c8e680380a5c790b04e4b8688b23ac89db1415950593cb7adcb2" exitCode=0 Apr 22 19:25:15.237046 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:15.236883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" event={"ID":"403f71c1-3bb4-4aef-9a26-b062489c9d03","Type":"ContainerDied","Data":"088fcf9d0cd2c8e680380a5c790b04e4b8688b23ac89db1415950593cb7adcb2"} Apr 22 19:25:15.237218 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:15.237206 2576 scope.go:117] "RemoveContainer" containerID="088fcf9d0cd2c8e680380a5c790b04e4b8688b23ac89db1415950593cb7adcb2" Apr 22 19:25:16.241453 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:16.241417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-z7d4f" event={"ID":"403f71c1-3bb4-4aef-9a26-b062489c9d03","Type":"ContainerStarted","Data":"da500bcbfd07a92dcf0e52aa19d2b00be9c2f2f8eaf8331ddc926752c85c9819"} Apr 22 19:25:26.269410 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:26.269373 2576 generic.go:358] "Generic (PLEG): container finished" podID="944e0b61-e3fd-4e73-88f5-25f65ad44d11" containerID="cf8b5b5df2ad368fe749a52cd2105740eba755753941e3bbbf3401ca9602a4a8" exitCode=0 Apr 22 19:25:26.269935 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:26.269454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" event={"ID":"944e0b61-e3fd-4e73-88f5-25f65ad44d11","Type":"ContainerDied","Data":"cf8b5b5df2ad368fe749a52cd2105740eba755753941e3bbbf3401ca9602a4a8"} Apr 22 19:25:26.269935 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:26.269876 2576 scope.go:117] "RemoveContainer" containerID="cf8b5b5df2ad368fe749a52cd2105740eba755753941e3bbbf3401ca9602a4a8" Apr 22 19:25:27.274252 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:27.274212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mqx2f" event={"ID":"944e0b61-e3fd-4e73-88f5-25f65ad44d11","Type":"ContainerStarted","Data":"636d39db87e048a9f462f6d4a3708b3a37e217653fdbc4c3c04a885173e93b6a"} Apr 22 19:25:30.283778 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:30.283741 2576 generic.go:358] "Generic (PLEG): container finished" podID="97ce33ae-619d-451a-8e53-dc35cdbba9e3" containerID="cfc799c07791d37a8068ecf7a9e0901ce7ab181e8afff89503464754567dde2f" exitCode=0 Apr 22 19:25:30.284244 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:30.283801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-748dj" 
event={"ID":"97ce33ae-619d-451a-8e53-dc35cdbba9e3","Type":"ContainerDied","Data":"cfc799c07791d37a8068ecf7a9e0901ce7ab181e8afff89503464754567dde2f"} Apr 22 19:25:30.284244 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:30.284210 2576 scope.go:117] "RemoveContainer" containerID="cfc799c07791d37a8068ecf7a9e0901ce7ab181e8afff89503464754567dde2f" Apr 22 19:25:31.287509 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:25:31.287467 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-748dj" event={"ID":"97ce33ae-619d-451a-8e53-dc35cdbba9e3","Type":"ContainerStarted","Data":"8c94f50dd2a223125bfbe9cbeab53c31b8fe9be1b3b779da55e0d80f64c720c9"} Apr 22 19:27:10.083946 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.083908 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j"] Apr 22 19:27:10.084386 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.084190 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14676505-81f2-4619-b4da-c41d68fefcce" containerName="registry" Apr 22 19:27:10.084386 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.084203 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14676505-81f2-4619-b4da-c41d68fefcce" containerName="registry" Apr 22 19:27:10.084386 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.084266 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14676505-81f2-4619-b4da-c41d68fefcce" containerName="registry" Apr 22 19:27:10.086897 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.086877 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.088655 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.088635 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:27:10.089201 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.089173 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:27:10.089201 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.089174 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:27:10.089378 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.089175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 19:27:10.095825 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.095792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j"] Apr 22 19:27:10.136103 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.136068 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7"] Apr 22 19:27:10.139306 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.139290 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.141358 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.141338 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 19:27:10.141507 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.141489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 19:27:10.141576 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.141565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 19:27:10.141953 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.141935 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 19:27:10.150442 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.150421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7"] Apr 22 19:27:10.221032 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.220999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-hub\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.221032 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.221292 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b972967b-4d95-47d8-ad48-5f46693f4a94-klusterlet-config\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.221292 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-ca\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.221292 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d84a500c-4007-49f8-bc5c-961dbc03d506-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.221292 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsfxz\" (UniqueName: \"kubernetes.io/projected/d84a500c-4007-49f8-bc5c-961dbc03d506-kube-api-access-zsfxz\") pod 
\"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.221292 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s84xc\" (UniqueName: \"kubernetes.io/projected/b972967b-4d95-47d8-ad48-5f46693f4a94-kube-api-access-s84xc\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.221292 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221233 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b972967b-4d95-47d8-ad48-5f46693f4a94-tmp\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.221474 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.221307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.322365 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: 
\"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.322524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-hub\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.322524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.322524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b972967b-4d95-47d8-ad48-5f46693f4a94-klusterlet-config\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.322524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-ca\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.322524 ip-10-0-129-175 kubenswrapper[2576]: I0422 
19:27:10.322478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d84a500c-4007-49f8-bc5c-961dbc03d506-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.322524 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsfxz\" (UniqueName: \"kubernetes.io/projected/d84a500c-4007-49f8-bc5c-961dbc03d506-kube-api-access-zsfxz\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.322868 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s84xc\" (UniqueName: \"kubernetes.io/projected/b972967b-4d95-47d8-ad48-5f46693f4a94-kube-api-access-s84xc\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.322868 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.322566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b972967b-4d95-47d8-ad48-5f46693f4a94-tmp\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.323291 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.323006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/b972967b-4d95-47d8-ad48-5f46693f4a94-tmp\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.323291 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.323280 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d84a500c-4007-49f8-bc5c-961dbc03d506-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.325053 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.325023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-ca\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.325053 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.325046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.325310 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.325288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b972967b-4d95-47d8-ad48-5f46693f4a94-klusterlet-config\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.325353 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.325299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.325534 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.325517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d84a500c-4007-49f8-bc5c-961dbc03d506-hub\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.329748 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.329704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsfxz\" (UniqueName: \"kubernetes.io/projected/d84a500c-4007-49f8-bc5c-961dbc03d506-kube-api-access-zsfxz\") pod \"cluster-proxy-proxy-agent-68d99d77bb-8wrl7\" (UID: \"d84a500c-4007-49f8-bc5c-961dbc03d506\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.329837 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.329781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s84xc\" (UniqueName: \"kubernetes.io/projected/b972967b-4d95-47d8-ad48-5f46693f4a94-kube-api-access-s84xc\") pod \"klusterlet-addon-workmgr-78d6fdc488-skp2j\" (UID: \"b972967b-4d95-47d8-ad48-5f46693f4a94\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.396690 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.396595 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:10.462256 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.462224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" Apr 22 19:27:10.527353 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.524197 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j"] Apr 22 19:27:10.530904 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:27:10.530876 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb972967b_4d95_47d8_ad48_5f46693f4a94.slice/crio-044256005eae0928c02a7fa22da1fd69bb80bb3194f70609b9864a4cffe93497 WatchSource:0}: Error finding container 044256005eae0928c02a7fa22da1fd69bb80bb3194f70609b9864a4cffe93497: Status 404 returned error can't find the container with id 044256005eae0928c02a7fa22da1fd69bb80bb3194f70609b9864a4cffe93497 Apr 22 19:27:10.563511 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.563476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" event={"ID":"b972967b-4d95-47d8-ad48-5f46693f4a94","Type":"ContainerStarted","Data":"044256005eae0928c02a7fa22da1fd69bb80bb3194f70609b9864a4cffe93497"} Apr 22 19:27:10.587234 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:10.587200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7"] Apr 22 19:27:10.590513 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:27:10.590484 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd84a500c_4007_49f8_bc5c_961dbc03d506.slice/crio-24bbc73bbfa495785e47acc5c885d1abfc1b3090e825c6f4b3c9bbe32b8d1870 WatchSource:0}: Error finding container 24bbc73bbfa495785e47acc5c885d1abfc1b3090e825c6f4b3c9bbe32b8d1870: Status 404 returned error can't find the container with id 24bbc73bbfa495785e47acc5c885d1abfc1b3090e825c6f4b3c9bbe32b8d1870 Apr 22 19:27:11.567474 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:11.567432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" event={"ID":"d84a500c-4007-49f8-bc5c-961dbc03d506","Type":"ContainerStarted","Data":"24bbc73bbfa495785e47acc5c885d1abfc1b3090e825c6f4b3c9bbe32b8d1870"} Apr 22 19:27:14.578832 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:14.578796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" event={"ID":"d84a500c-4007-49f8-bc5c-961dbc03d506","Type":"ContainerStarted","Data":"63a327fdcd1c40b7a3472fb57204e8fa3feff770dabaaef24ae2e8eb0fa2bca1"} Apr 22 19:27:15.584152 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:15.584070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" event={"ID":"b972967b-4d95-47d8-ad48-5f46693f4a94","Type":"ContainerStarted","Data":"2f26472e904fcc09ba00df60a75aa1f84700f29142954749867f627759fa6e0f"} Apr 22 19:27:15.584596 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:15.584401 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:15.586050 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:15.586026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" Apr 22 19:27:15.599982 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:15.599929 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-78d6fdc488-skp2j" podStartSLOduration=0.86642747 podStartE2EDuration="5.599914351s" podCreationTimestamp="2026-04-22 19:27:10 +0000 UTC" firstStartedPulling="2026-04-22 19:27:10.532647297 +0000 UTC m=+232.393601012" lastFinishedPulling="2026-04-22 19:27:15.266134178 +0000 UTC m=+237.127087893" observedRunningTime="2026-04-22 19:27:15.598938542 +0000 UTC m=+237.459892279" watchObservedRunningTime="2026-04-22 19:27:15.599914351 +0000 UTC m=+237.460868089" Apr 22 19:27:17.590823 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:17.590785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" event={"ID":"d84a500c-4007-49f8-bc5c-961dbc03d506","Type":"ContainerStarted","Data":"baf86df4a9efa7eeb4bbfd4cceee06b6caf7a24cbe57d61b139a25188dec9392"} Apr 22 19:27:17.590823 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:17.590826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" event={"ID":"d84a500c-4007-49f8-bc5c-961dbc03d506","Type":"ContainerStarted","Data":"d7d210e2cb484304a5ca43c413ad02b1d7b94e1194c117acfdb0d0e5d0e2c2a7"} Apr 22 19:27:17.606951 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:27:17.606882 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68d99d77bb-8wrl7" podStartSLOduration=1.217071309 podStartE2EDuration="7.606868752s" podCreationTimestamp="2026-04-22 19:27:10 +0000 UTC" firstStartedPulling="2026-04-22 19:27:10.592167233 +0000 UTC m=+232.453120948" lastFinishedPulling="2026-04-22 
19:27:16.981964675 +0000 UTC m=+238.842918391" observedRunningTime="2026-04-22 19:27:17.60632092 +0000 UTC m=+239.467274666" watchObservedRunningTime="2026-04-22 19:27:17.606868752 +0000 UTC m=+239.467822489" Apr 22 19:28:18.614675 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:28:18.614644 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:28:18.615632 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:28:18.615613 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:28:18.623797 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:28:18.623768 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:30:51.741795 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.741757 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-fvvmd"] Apr 22 19:30:51.743614 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.743597 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:51.746305 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.746284 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:30:51.746428 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.746334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-cbcgb\"" Apr 22 19:30:51.746595 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.746582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 19:30:51.747696 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.747683 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:30:51.761651 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.761625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-fvvmd"] Apr 22 19:30:51.804079 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.804050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtctv\" (UniqueName: \"kubernetes.io/projected/0fef4c3a-421b-488c-909a-9f8cde300f7c-kube-api-access-jtctv\") pod \"model-serving-api-86f7b4b499-fvvmd\" (UID: \"0fef4c3a-421b-488c-909a-9f8cde300f7c\") " pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:51.804232 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.804087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fef4c3a-421b-488c-909a-9f8cde300f7c-tls-certs\") pod \"model-serving-api-86f7b4b499-fvvmd\" (UID: \"0fef4c3a-421b-488c-909a-9f8cde300f7c\") " pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:51.905040 ip-10-0-129-175 kubenswrapper[2576]: 
I0422 19:30:51.905004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtctv\" (UniqueName: \"kubernetes.io/projected/0fef4c3a-421b-488c-909a-9f8cde300f7c-kube-api-access-jtctv\") pod \"model-serving-api-86f7b4b499-fvvmd\" (UID: \"0fef4c3a-421b-488c-909a-9f8cde300f7c\") " pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:51.905040 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.905046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fef4c3a-421b-488c-909a-9f8cde300f7c-tls-certs\") pod \"model-serving-api-86f7b4b499-fvvmd\" (UID: \"0fef4c3a-421b-488c-909a-9f8cde300f7c\") " pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:51.907592 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.907564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fef4c3a-421b-488c-909a-9f8cde300f7c-tls-certs\") pod \"model-serving-api-86f7b4b499-fvvmd\" (UID: \"0fef4c3a-421b-488c-909a-9f8cde300f7c\") " pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:51.913044 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:51.913021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtctv\" (UniqueName: \"kubernetes.io/projected/0fef4c3a-421b-488c-909a-9f8cde300f7c-kube-api-access-jtctv\") pod \"model-serving-api-86f7b4b499-fvvmd\" (UID: \"0fef4c3a-421b-488c-909a-9f8cde300f7c\") " pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:52.054356 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:52.054272 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:52.178056 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:52.178030 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-fvvmd"] Apr 22 19:30:52.180455 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:30:52.180429 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fef4c3a_421b_488c_909a_9f8cde300f7c.slice/crio-bff944e251f5e1ad1afe31572404e9b0a4ea42bd05d2e3f833ad562c72193215 WatchSource:0}: Error finding container bff944e251f5e1ad1afe31572404e9b0a4ea42bd05d2e3f833ad562c72193215: Status 404 returned error can't find the container with id bff944e251f5e1ad1afe31572404e9b0a4ea42bd05d2e3f833ad562c72193215 Apr 22 19:30:52.182022 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:52.182005 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:30:53.183476 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:53.183443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-fvvmd" event={"ID":"0fef4c3a-421b-488c-909a-9f8cde300f7c","Type":"ContainerStarted","Data":"bff944e251f5e1ad1afe31572404e9b0a4ea42bd05d2e3f833ad562c72193215"} Apr 22 19:30:55.193529 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:55.193489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-fvvmd" event={"ID":"0fef4c3a-421b-488c-909a-9f8cde300f7c","Type":"ContainerStarted","Data":"80f785636ac53a2637bbab96dfdaf9967a68a98cf42efb2854c2a6a9ab1c37f6"} Apr 22 19:30:55.194005 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:55.193583 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:30:55.211460 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:30:55.211412 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-fvvmd" podStartSLOduration=2.128510133 podStartE2EDuration="4.211398766s" podCreationTimestamp="2026-04-22 19:30:51 +0000 UTC" firstStartedPulling="2026-04-22 19:30:52.182126831 +0000 UTC m=+454.043080546" lastFinishedPulling="2026-04-22 19:30:54.265015458 +0000 UTC m=+456.125969179" observedRunningTime="2026-04-22 19:30:55.209521756 +0000 UTC m=+457.070475493" watchObservedRunningTime="2026-04-22 19:30:55.211398766 +0000 UTC m=+457.072352700" Apr 22 19:31:06.199827 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:06.199799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-fvvmd" Apr 22 19:31:27.137023 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.136983 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6"] Apr 22 19:31:27.140325 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.140307 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:31:27.142100 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.142074 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5ktbc\"" Apr 22 19:31:27.148007 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.147979 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6"] Apr 22 19:31:27.295720 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.295689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b502dc7-70d7-4339-a699-6072ff2e605b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7988f94697-hbhs6\" (UID: \"2b502dc7-70d7-4339-a699-6072ff2e605b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:31:27.325476 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.325444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph"] Apr 22 19:31:27.328755 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.328736 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:31:27.338877 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.338853 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph"] Apr 22 19:31:27.396609 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.396514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b502dc7-70d7-4339-a699-6072ff2e605b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7988f94697-hbhs6\" (UID: \"2b502dc7-70d7-4339-a699-6072ff2e605b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:31:27.396961 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.396939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b502dc7-70d7-4339-a699-6072ff2e605b-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7988f94697-hbhs6\" (UID: \"2b502dc7-70d7-4339-a699-6072ff2e605b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:31:27.451666 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.451627 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:31:27.498069 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.498020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2d66057-78ce-40fc-b24b-8a7fab95a623-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-xx2ph\" (UID: \"f2d66057-78ce-40fc-b24b-8a7fab95a623\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:31:27.575623 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.575573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6"] Apr 22 19:31:27.579534 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:31:27.579505 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b502dc7_70d7_4339_a699_6072ff2e605b.slice/crio-914457c0eee34fa10258bcf5b944b995178658a2bbe87239ef2cf66f03956879 WatchSource:0}: Error finding container 914457c0eee34fa10258bcf5b944b995178658a2bbe87239ef2cf66f03956879: Status 404 returned error can't find the container with id 914457c0eee34fa10258bcf5b944b995178658a2bbe87239ef2cf66f03956879 Apr 22 19:31:27.598955 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.598924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2d66057-78ce-40fc-b24b-8a7fab95a623-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-xx2ph\" (UID: \"f2d66057-78ce-40fc-b24b-8a7fab95a623\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:31:27.599270 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.599251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2d66057-78ce-40fc-b24b-8a7fab95a623-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-xx2ph\" (UID: \"f2d66057-78ce-40fc-b24b-8a7fab95a623\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:31:27.639851 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.639814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:31:27.756342 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:27.756311 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph"] Apr 22 19:31:27.760277 ip-10-0-129-175 kubenswrapper[2576]: W0422 19:31:27.760247 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2d66057_78ce_40fc_b24b_8a7fab95a623.slice/crio-4d23ca3d011f2e663565654d433166f274aed49354477a13d141945eb2947758 WatchSource:0}: Error finding container 4d23ca3d011f2e663565654d433166f274aed49354477a13d141945eb2947758: Status 404 returned error can't find the container with id 4d23ca3d011f2e663565654d433166f274aed49354477a13d141945eb2947758 Apr 22 19:31:28.290162 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:28.290108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" event={"ID":"2b502dc7-70d7-4339-a699-6072ff2e605b","Type":"ContainerStarted","Data":"914457c0eee34fa10258bcf5b944b995178658a2bbe87239ef2cf66f03956879"} Apr 22 19:31:28.291559 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:28.291532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" 
event={"ID":"f2d66057-78ce-40fc-b24b-8a7fab95a623","Type":"ContainerStarted","Data":"4d23ca3d011f2e663565654d433166f274aed49354477a13d141945eb2947758"} Apr 22 19:31:32.310432 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:32.310392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" event={"ID":"2b502dc7-70d7-4339-a699-6072ff2e605b","Type":"ContainerStarted","Data":"c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542"} Apr 22 19:31:32.311824 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:32.311789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" event={"ID":"f2d66057-78ce-40fc-b24b-8a7fab95a623","Type":"ContainerStarted","Data":"edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432"} Apr 22 19:31:36.326310 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:36.326277 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerID="c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542" exitCode=0 Apr 22 19:31:36.326713 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:36.326352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" event={"ID":"2b502dc7-70d7-4339-a699-6072ff2e605b","Type":"ContainerDied","Data":"c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542"} Apr 22 19:31:36.327790 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:36.327769 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerID="edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432" exitCode=0 Apr 22 19:31:36.327900 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:31:36.327813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" 
event={"ID":"f2d66057-78ce-40fc-b24b-8a7fab95a623","Type":"ContainerDied","Data":"edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432"} Apr 22 19:32:00.416637 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:00.416604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" event={"ID":"2b502dc7-70d7-4339-a699-6072ff2e605b","Type":"ContainerStarted","Data":"43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7"} Apr 22 19:32:01.421449 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:01.421408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" event={"ID":"f2d66057-78ce-40fc-b24b-8a7fab95a623","Type":"ContainerStarted","Data":"b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140"} Apr 22 19:32:01.421928 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:01.421619 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:32:01.421928 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:01.421764 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:32:01.423059 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:01.422992 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:32:01.423178 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:01.423073 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:32:01.439508 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:01.439439 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podStartSLOduration=1.6993811509999999 podStartE2EDuration="34.439429159s" podCreationTimestamp="2026-04-22 19:31:27 +0000 UTC" firstStartedPulling="2026-04-22 19:31:27.58137373 +0000 UTC m=+489.442327449" lastFinishedPulling="2026-04-22 19:32:00.321421739 +0000 UTC m=+522.182375457" observedRunningTime="2026-04-22 19:32:01.438596124 +0000 UTC m=+523.299549861" watchObservedRunningTime="2026-04-22 19:32:01.439429159 +0000 UTC m=+523.300382895" Apr 22 19:32:01.455660 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:01.455618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podStartSLOduration=1.5851411149999999 podStartE2EDuration="34.455607495s" podCreationTimestamp="2026-04-22 19:31:27 +0000 UTC" firstStartedPulling="2026-04-22 19:31:27.762285381 +0000 UTC m=+489.623239095" lastFinishedPulling="2026-04-22 19:32:00.632751621 +0000 UTC m=+522.493705475" observedRunningTime="2026-04-22 19:32:01.454037684 +0000 UTC m=+523.314991421" watchObservedRunningTime="2026-04-22 19:32:01.455607495 +0000 UTC m=+523.316561231" Apr 22 19:32:02.426918 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:02.426413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:32:02.427360 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:02.427216 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:32:12.426442 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:12.426344 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:32:12.427615 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:12.427590 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:32:22.426528 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:22.426472 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:32:22.427751 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:22.427701 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:32:32.425819 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:32.425776 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:32:32.428091 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:32.428058 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:32:42.426333 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:42.426285 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:32:42.427538 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:42.427514 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:32:52.426147 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:52.426100 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:32:52.427384 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:32:52.427354 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" 
Apr 22 19:33:02.427416 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:02.427379 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:33:02.427846 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:02.427652 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:33:12.428905 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:12.428868 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:33:18.637310 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:18.637283 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:33:18.637762 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:18.637290 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:33:47.428991 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:47.428915 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6"] Apr 22 19:33:47.429440 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:47.429209 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container" containerID="cri-o://43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7" gracePeriod=30 Apr 22 
19:33:47.543634 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:47.543601 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph"] Apr 22 19:33:47.543917 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:47.543879 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container" containerID="cri-o://b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140" gracePeriod=30 Apr 22 19:33:51.283995 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.283968 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:33:51.461211 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.461167 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2d66057-78ce-40fc-b24b-8a7fab95a623-kserve-provision-location\") pod \"f2d66057-78ce-40fc-b24b-8a7fab95a623\" (UID: \"f2d66057-78ce-40fc-b24b-8a7fab95a623\") " Apr 22 19:33:51.461555 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.461525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2d66057-78ce-40fc-b24b-8a7fab95a623-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f2d66057-78ce-40fc-b24b-8a7fab95a623" (UID: "f2d66057-78ce-40fc-b24b-8a7fab95a623"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:51.562004 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.561965 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2d66057-78ce-40fc-b24b-8a7fab95a623-kserve-provision-location\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:33:51.747210 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.747118 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerID="b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140" exitCode=0 Apr 22 19:33:51.747210 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.747189 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" Apr 22 19:33:51.747381 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.747210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" event={"ID":"f2d66057-78ce-40fc-b24b-8a7fab95a623","Type":"ContainerDied","Data":"b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140"} Apr 22 19:33:51.747381 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.747252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph" event={"ID":"f2d66057-78ce-40fc-b24b-8a7fab95a623","Type":"ContainerDied","Data":"4d23ca3d011f2e663565654d433166f274aed49354477a13d141945eb2947758"} Apr 22 19:33:51.747381 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.747268 2576 scope.go:117] "RemoveContainer" containerID="b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140" Apr 22 19:33:51.755343 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.755223 2576 scope.go:117] "RemoveContainer" 
containerID="edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432" Apr 22 19:33:51.762134 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.762119 2576 scope.go:117] "RemoveContainer" containerID="b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140" Apr 22 19:33:51.762365 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:33:51.762346 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140\": container with ID starting with b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140 not found: ID does not exist" containerID="b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140" Apr 22 19:33:51.762415 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.762372 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140"} err="failed to get container status \"b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140\": rpc error: code = NotFound desc = could not find container \"b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140\": container with ID starting with b223c7114bb2c4be3140d9edb46231f97c24fa5d7f8cf00275c536686c3f0140 not found: ID does not exist" Apr 22 19:33:51.762415 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.762397 2576 scope.go:117] "RemoveContainer" containerID="edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432" Apr 22 19:33:51.762623 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:33:51.762605 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432\": container with ID starting with edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432 not found: ID does not exist" 
containerID="edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432" Apr 22 19:33:51.762675 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.762633 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432"} err="failed to get container status \"edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432\": rpc error: code = NotFound desc = could not find container \"edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432\": container with ID starting with edae3de741c555f84c3ba3a0074b949dbf5702208c829a19e1554fe750b41432 not found: ID does not exist" Apr 22 19:33:51.769432 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.769407 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph"] Apr 22 19:33:51.772992 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.772970 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-xx2ph"] Apr 22 19:33:51.970006 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:51.969983 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:33:52.066420 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.066340 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b502dc7-70d7-4339-a699-6072ff2e605b-kserve-provision-location\") pod \"2b502dc7-70d7-4339-a699-6072ff2e605b\" (UID: \"2b502dc7-70d7-4339-a699-6072ff2e605b\") " Apr 22 19:33:52.066649 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.066627 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b502dc7-70d7-4339-a699-6072ff2e605b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2b502dc7-70d7-4339-a699-6072ff2e605b" (UID: "2b502dc7-70d7-4339-a699-6072ff2e605b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:33:52.167804 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.167768 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b502dc7-70d7-4339-a699-6072ff2e605b-kserve-provision-location\") on node \"ip-10-0-129-175.ec2.internal\" DevicePath \"\"" Apr 22 19:33:52.736760 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.736709 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" path="/var/lib/kubelet/pods/f2d66057-78ce-40fc-b24b-8a7fab95a623/volumes" Apr 22 19:33:52.751024 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.750998 2576 generic.go:358] "Generic (PLEG): container finished" podID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerID="43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7" exitCode=0 Apr 22 19:33:52.751145 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.751069 2576 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" Apr 22 19:33:52.751145 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.751078 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" event={"ID":"2b502dc7-70d7-4339-a699-6072ff2e605b","Type":"ContainerDied","Data":"43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7"} Apr 22 19:33:52.751145 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.751104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6" event={"ID":"2b502dc7-70d7-4339-a699-6072ff2e605b","Type":"ContainerDied","Data":"914457c0eee34fa10258bcf5b944b995178658a2bbe87239ef2cf66f03956879"} Apr 22 19:33:52.751145 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.751124 2576 scope.go:117] "RemoveContainer" containerID="43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7" Apr 22 19:33:52.759227 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.759209 2576 scope.go:117] "RemoveContainer" containerID="c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542" Apr 22 19:33:52.764912 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.764888 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6"] Apr 22 19:33:52.766466 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.766449 2576 scope.go:117] "RemoveContainer" containerID="43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7" Apr 22 19:33:52.766686 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:33:52.766668 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7\": container with ID starting with 
43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7 not found: ID does not exist" containerID="43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7" Apr 22 19:33:52.766827 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.766694 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7"} err="failed to get container status \"43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7\": rpc error: code = NotFound desc = could not find container \"43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7\": container with ID starting with 43677c5fa010d588f1490f5fb006fb69881557ff08f6231a8d65cc2d10b9eac7 not found: ID does not exist" Apr 22 19:33:52.766827 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.766712 2576 scope.go:117] "RemoveContainer" containerID="c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542" Apr 22 19:33:52.766993 ip-10-0-129-175 kubenswrapper[2576]: E0422 19:33:52.766974 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542\": container with ID starting with c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542 not found: ID does not exist" containerID="c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542" Apr 22 19:33:52.767035 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.767000 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542"} err="failed to get container status \"c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542\": rpc error: code = NotFound desc = could not find container \"c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542\": container with ID starting with 
c75ad21171ee009141d5ddf7b6664193856be802b4b3d2c92b6fce1c71e6f542 not found: ID does not exist" Apr 22 19:33:52.771153 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:52.771134 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7988f94697-hbhs6"] Apr 22 19:33:54.737092 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:33:54.737056 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" path="/var/lib/kubelet/pods/2b502dc7-70d7-4339-a699-6072ff2e605b/volumes" Apr 22 19:38:18.659785 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:38:18.659749 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:38:18.660443 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:38:18.660423 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:43:18.678378 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:43:18.678351 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:43:18.680576 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:43:18.680552 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:48:18.699584 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:48:18.699554 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:48:18.702085 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:48:18.701686 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:53:18.723471 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:53:18.723352 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:53:18.728403 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:53:18.725989 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:58:18.745872 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:58:18.745761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 19:58:18.748885 ip-10-0-129-175 kubenswrapper[2576]: I0422 19:58:18.747909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 20:03:18.764884 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:03:18.764780 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 20:03:18.767901 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:03:18.767365 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 20:08:18.786384 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:08:18.786272 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 20:08:18.789536 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:08:18.789010 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log" Apr 22 20:11:54.919857 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:11:54.919823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rb7nf_b1a86c6d-76f8-443f-bc6b-13cdb22ad4b8/global-pull-secret-syncer/0.log" Apr 22 20:11:55.053398 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:11:55.053369 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w4jz7_2fb0effb-90c6-4bf4-8981-baf430cec62a/konnectivity-agent/0.log" Apr 22 20:11:55.090987 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:11:55.090941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-175.ec2.internal_65b71d693c1ea98bc6c9a5017263d773/haproxy/0.log" Apr 22 20:11:58.526874 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:11:58.526803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-5sfmj_3dd0e38f-633c-4e17-8389-15a41d942093/cluster-monitoring-operator/0.log" Apr 22 20:11:58.741906 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:11:58.741879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jc52p_789c291d-695f-4244-bda8-52bd3b3ab880/node-exporter/0.log" Apr 22 20:11:58.759662 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:11:58.759635 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jc52p_789c291d-695f-4244-bda8-52bd3b3ab880/kube-rbac-proxy/0.log" Apr 22 20:11:58.776836 ip-10-0-129-175 
kubenswrapper[2576]: I0422 20:11:58.776815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jc52p_789c291d-695f-4244-bda8-52bd3b3ab880/init-textfile/0.log"
Apr 22 20:12:00.453070 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:00.453046 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wg445_3a182f49-9fd9-43cf-983a-9e45aae094ac/networking-console-plugin/0.log"
Apr 22 20:12:00.899606 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:00.899528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/1.log"
Apr 22 20:12:00.904261 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:00.904243 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5rw5p_af8ba87e-3b79-4273-af80-a16c607dbf85/console-operator/2.log"
Apr 22 20:12:01.668000 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:01.667967 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-pq94v_45a00f57-cea9-483d-bda8-358ac125ff3b/volume-data-source-validator/0.log"
Apr 22 20:12:02.120496 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120466 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"]
Apr 22 20:12:02.120811 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120796 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container"
Apr 22 20:12:02.120899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120813 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container"
Apr 22 20:12:02.120899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120830 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="storage-initializer"
Apr 22 20:12:02.120899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120837 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="storage-initializer"
Apr 22 20:12:02.120899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120844 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container"
Apr 22 20:12:02.120899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120849 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container"
Apr 22 20:12:02.120899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120864 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="storage-initializer"
Apr 22 20:12:02.120899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120870 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="storage-initializer"
Apr 22 20:12:02.121101 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120921 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2d66057-78ce-40fc-b24b-8a7fab95a623" containerName="kserve-container"
Apr 22 20:12:02.121101 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.120931 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b502dc7-70d7-4339-a699-6072ff2e605b" containerName="kserve-container"
Apr 22 20:12:02.123793 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.123777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.125799 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.125773 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-94jtb\"/\"default-dockercfg-btnhf\""
Apr 22 20:12:02.125947 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.125928 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-94jtb\"/\"openshift-service-ca.crt\""
Apr 22 20:12:02.126094 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.126078 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-94jtb\"/\"kube-root-ca.crt\""
Apr 22 20:12:02.130721 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.130695 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"]
Apr 22 20:12:02.168495 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.168461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-podres\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.168495 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.168496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-sys\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.168819 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.168513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnrxd\" (UniqueName: \"kubernetes.io/projected/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-kube-api-access-rnrxd\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.168819 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.168570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-proc\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.168819 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.168624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-lib-modules\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.252045 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.252018 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cj979_a439b73c-32fc-4331-99ac-1f81f67a857d/dns/0.log"
Apr 22 20:12:02.269022 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.268999 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cj979_a439b73c-32fc-4331-99ac-1f81f67a857d/kube-rbac-proxy/0.log"
Apr 22 20:12:02.269142 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-proc\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269142 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-lib-modules\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269142 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-proc\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269253 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-podres\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269253 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-sys\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269253 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-lib-modules\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269253 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnrxd\" (UniqueName: \"kubernetes.io/projected/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-kube-api-access-rnrxd\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269376 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-sys\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.269376 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.269297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-podres\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.276266 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.276237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnrxd\" (UniqueName: \"kubernetes.io/projected/bdef8c07-649b-4b60-9f82-aa5f4bf84ff5-kube-api-access-rnrxd\") pod \"perf-node-gather-daemonset-49whh\" (UID: \"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5\") " pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.367722 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.367697 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5hk26_eb4cf245-3080-4779-80ac-295e37a8327f/dns-node-resolver/0.log"
Apr 22 20:12:02.434047 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.433962 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:02.553112 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.553077 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"]
Apr 22 20:12:02.556158 ip-10-0-129-175 kubenswrapper[2576]: W0422 20:12:02.556131 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbdef8c07_649b_4b60_9f82_aa5f4bf84ff5.slice/crio-e153a1f0bcb18811c315ad4dbb50e6d635d15fa6f10089388f182fa69776ae46 WatchSource:0}: Error finding container e153a1f0bcb18811c315ad4dbb50e6d635d15fa6f10089388f182fa69776ae46: Status 404 returned error can't find the container with id e153a1f0bcb18811c315ad4dbb50e6d635d15fa6f10089388f182fa69776ae46
Apr 22 20:12:02.557825 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.557805 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:12:02.842054 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:02.842024 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pnb2n_a2085208-6038-4f2b-929e-b75fe56d5a28/node-ca/0.log"
Apr 22 20:12:03.106617 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:03.106519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh" event={"ID":"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5","Type":"ContainerStarted","Data":"0c7a56dc57c36cac6e8504361e0555f62331ca657fa1e6526eb0061bbb5bdbea"}
Apr 22 20:12:03.106617 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:03.106554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh" event={"ID":"bdef8c07-649b-4b60-9f82-aa5f4bf84ff5","Type":"ContainerStarted","Data":"e153a1f0bcb18811c315ad4dbb50e6d635d15fa6f10089388f182fa69776ae46"}
Apr 22 20:12:03.106899 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:03.106719 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:03.139299 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:03.139236 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh" podStartSLOduration=1.139215686 podStartE2EDuration="1.139215686s" podCreationTimestamp="2026-04-22 20:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:12:03.137665963 +0000 UTC m=+2924.998619700" watchObservedRunningTime="2026-04-22 20:12:03.139215686 +0000 UTC m=+2925.000169424"
Apr 22 20:12:03.524287 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:03.524262 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7697566b5d-hh5dn_117697b4-88b4-4152-942c-224dcf13a685/router/0.log"
Apr 22 20:12:03.835320 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:03.835239 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rbbtr_04ec7b20-5a3e-44b4-afea-44994d100476/serve-healthcheck-canary/0.log"
Apr 22 20:12:04.128054 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:04.127975 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-748dj_97ce33ae-619d-451a-8e53-dc35cdbba9e3/insights-operator/0.log"
Apr 22 20:12:04.128714 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:04.128695 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-748dj_97ce33ae-619d-451a-8e53-dc35cdbba9e3/insights-operator/1.log"
Apr 22 20:12:04.196056 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:04.196029 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h762s_96abbc7c-9222-4307-b3ba-c6d559756d44/kube-rbac-proxy/0.log"
Apr 22 20:12:04.213068 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:04.212996 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h762s_96abbc7c-9222-4307-b3ba-c6d559756d44/exporter/0.log"
Apr 22 20:12:04.243910 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:04.243886 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h762s_96abbc7c-9222-4307-b3ba-c6d559756d44/extractor/0.log"
Apr 22 20:12:06.227905 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:06.227879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-fvvmd_0fef4c3a-421b-488c-909a-9f8cde300f7c/server/0.log"
Apr 22 20:12:09.118376 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:09.118346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-94jtb/perf-node-gather-daemonset-49whh"
Apr 22 20:12:10.276955 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:10.276923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mqx2f_944e0b61-e3fd-4e73-88f5-25f65ad44d11/kube-storage-version-migrator-operator/1.log"
Apr 22 20:12:10.277759 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:10.277709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mqx2f_944e0b61-e3fd-4e73-88f5-25f65ad44d11/kube-storage-version-migrator-operator/0.log"
Apr 22 20:12:11.536399 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.536324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hr5vn_17ada943-fc5e-4b09-bf82-9132909cb32d/kube-multus-additional-cni-plugins/0.log"
Apr 22 20:12:11.553891 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.553867 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hr5vn_17ada943-fc5e-4b09-bf82-9132909cb32d/egress-router-binary-copy/0.log"
Apr 22 20:12:11.575014 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.574990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hr5vn_17ada943-fc5e-4b09-bf82-9132909cb32d/cni-plugins/0.log"
Apr 22 20:12:11.595937 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.595909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hr5vn_17ada943-fc5e-4b09-bf82-9132909cb32d/bond-cni-plugin/0.log"
Apr 22 20:12:11.613567 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.613547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hr5vn_17ada943-fc5e-4b09-bf82-9132909cb32d/routeoverride-cni/0.log"
Apr 22 20:12:11.632483 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.632460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hr5vn_17ada943-fc5e-4b09-bf82-9132909cb32d/whereabouts-cni-bincopy/0.log"
Apr 22 20:12:11.649065 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.649043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hr5vn_17ada943-fc5e-4b09-bf82-9132909cb32d/whereabouts-cni/0.log"
Apr 22 20:12:11.676808 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.676787 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nflqk_4b532865-112c-43d5-a22b-58a5cece9682/kube-multus/0.log"
Apr 22 20:12:11.699765 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.699719 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4sl88_ac86a56e-148b-4fb4-8415-acff438d7915/network-metrics-daemon/0.log"
Apr 22 20:12:11.716581 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:11.716556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4sl88_ac86a56e-148b-4fb4-8415-acff438d7915/kube-rbac-proxy/0.log"
Apr 22 20:12:12.424657 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.424631 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/ovn-controller/0.log"
Apr 22 20:12:12.452063 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.452032 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/ovn-acl-logging/0.log"
Apr 22 20:12:12.466657 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.466635 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/kube-rbac-proxy-node/0.log"
Apr 22 20:12:12.484457 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.484433 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 20:12:12.501586 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.501558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/northd/0.log"
Apr 22 20:12:12.521467 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.521448 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/nbdb/0.log"
Apr 22 20:12:12.541208 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.541187 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/sbdb/0.log"
Apr 22 20:12:12.636905 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:12.636876 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2gjj_7ec73ca3-22df-4ef0-ad03-92031016c8b8/ovnkube-controller/0.log"
Apr 22 20:12:14.031676 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:14.031649 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-v2f7q_69e02842-f395-40eb-875a-3561c42e7eef/check-endpoints/0.log"
Apr 22 20:12:14.049735 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:14.049709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-858rm_c887aba7-380b-4a80-bc2c-6e89b986da6b/network-check-target-container/0.log"
Apr 22 20:12:14.849579 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:14.849551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-68jrq_6c9d2810-4d20-4eb6-9318-15765f879dfa/iptables-alerter/0.log"
Apr 22 20:12:15.454979 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:15.454950 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tmk7w_cf7e7909-ac67-46c2-af45-9ceb49eb60c2/tuned/0.log"
Apr 22 20:12:16.951754 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:16.951704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-wrxdf_3543ced1-0882-444d-be8a-53d6ef47c1e6/cluster-samples-operator/0.log"
Apr 22 20:12:16.963642 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:16.963622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-wrxdf_3543ced1-0882-444d-be8a-53d6ef47c1e6/cluster-samples-operator-watch/0.log"
Apr 22 20:12:17.835143 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:17.835115 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-z7d4f_403f71c1-3bb4-4aef-9a26-b062489c9d03/service-ca-operator/1.log"
Apr 22 20:12:17.835912 ip-10-0-129-175 kubenswrapper[2576]: I0422 20:12:17.835899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-z7d4f_403f71c1-3bb4-4aef-9a26-b062489c9d03/service-ca-operator/0.log"