Apr 17 11:27:47.289975 ip-10-0-134-64 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:27:47.289988 ip-10-0-134-64 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:27:47.289997 ip-10-0-134-64 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:27:47.290299 ip-10-0-134-64 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:27:57.417664 ip-10-0-134-64 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:27:57.417678 ip-10-0-134-64 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 78ef1bcba298437fa3d28f0728fef4a7 --
Apr 17 11:30:22.084588 ip-10-0-134-64 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:30:22.534475 ip-10-0-134-64 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:30:22.534475 ip-10-0-134-64 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:30:22.534475 ip-10-0-134-64 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:30:22.534475 ip-10-0-134-64 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:30:22.534475 ip-10-0-134-64 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:30:22.537626 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.537534 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:30:22.542744 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542728 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:22.542744 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542743 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542747 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542751 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542754 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542757 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542761 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542763 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542771 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542774 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542777 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542780 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542782 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542785 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542787 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542790 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542792 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542795 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542797 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542800 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542803 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:22.542805 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542806 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542809 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542811 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542815 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542817 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542820 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542823 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542825 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542828 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542830 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542833 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542836 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542839 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542841 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542844 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542847 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542850 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542853 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542856 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542858 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:22.543297 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542861 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542863 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542866 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542869 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542871 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542873 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542876 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542879 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542881 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542884 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542886 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542889 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542891 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542895 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542899 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542901 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542904 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542909 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
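Editor's note: the first boot above fails before the kubelet ever runs (missing EnvironmentFile, missing `crio.service`), and the failure lines are easy to lose in the surrounding noise. A minimal sketch for pulling just the systemd failure lines out of a journal dump like this one; the sample text is a hypothetical trimmed excerpt, not the full log.

```python
# Sketch (editorial, not part of the log): filter a journal dump down to the
# lines where systemd reports a kubelet.service failure, so the
# missing-environment-file / missing-crio.service chain is easy to see.
# SAMPLE is a hypothetical excerpt, trimmed for brevity.

SAMPLE = """\
Apr 17 11:27:47.289975 ip-10-0-134-64 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:27:57.417664 ip-10-0-134-64 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:30:22.084588 ip-10-0-134-64 systemd[1]: Starting Kubernetes Kubelet...
"""

def kubelet_failures(journal_text: str) -> list[str]:
    """Return only the lines where systemd reports a kubelet.service failure."""
    return [
        line for line in journal_text.splitlines()
        if "systemd[1]: kubelet.service: Failed" in line
    ]

for line in kubelet_failures(SAMPLE):
    print(line)
```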
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542912 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:22.543830 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542916 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542919 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542921 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542924 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542927 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542930 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542932 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542935 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542938 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542941 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542943 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542947 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542949 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542952 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542955 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542957 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542960 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542964 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542967 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542969 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:22.544311 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542972 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542974 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542977 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542981 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
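Editor's note: the bulk of this startup noise is `feature_gate.go:328] unrecognized feature gate: <Name>` warnings, and the kubelet emits the whole list more than once (compare the `W0417 11:30:22.5427xx` and `11:30:22.5445xx` passes). A minimal sketch for tallying the warnings per gate name; the sample is a small hypothetical excerpt with the log prefix elided.

```python
import re
from collections import Counter

# Sketch (editorial, not part of the log): count "unrecognized feature gate"
# warnings per gate name in a journal dump. Since the gate list is printed
# in more than one pass, per-name counts > 1 reveal the repetition.
# SAMPLE is a hypothetical excerpt with the journald prefix elided.

SAMPLE = """\
W0417 11:30:22.542728 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
W0417 11:30:22.542780 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
W0417 11:30:22.544537 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
"""

GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def gate_counts(journal_text: str) -> Counter:
    """Map each unrecognized gate name to how many times it was reported."""
    return Counter(GATE_RE.findall(journal_text))

counts = gate_counts(SAMPLE)
print(counts["DualReplica"])  # → 2 (reported in both passes)
```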
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542984 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.542987 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544537 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544545 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544548 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544551 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544554 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544556 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544559 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544562 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544564 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544567 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544569 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544572 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544574 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544578 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:30:22.544798 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544582 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544586 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544590 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544593 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544597 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544599 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544602 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544605 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544607 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544610 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544612 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544615 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544617 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544620 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544623 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544626 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544629 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544631 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544634 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:22.545288 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544636 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544639 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544642 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544644 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544647 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544650 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544652 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544655 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544658 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544661 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544664 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544666 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544669 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544671 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544674 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544677 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544679 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544682 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544685 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544687 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:22.545764 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544690 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544692 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544695 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544697 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544700 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544703 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544705 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544710 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544713 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544715 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544718 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544720 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544723 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544725 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544728 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544731 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544734 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544736 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544739 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544741 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:22.546242 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544744 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544746 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544749 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544751 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544753 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544756 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544758 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544761 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544764 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544766 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544769 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544771 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.544774 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544842 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544849 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544856 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544861 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544866 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544869 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544876 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544881 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:30:22.546745 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544885 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544888 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544892 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544895 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544898 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544901 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544904 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544907 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544910 2577 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544913 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544916 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544920 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544923 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544926 2577 flags.go:64] FLAG: --config-dir=""
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544929 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544932 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544936 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544939 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544942 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544945 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544949 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544952 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544955 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544958 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544962 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:30:22.547245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544966 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544969 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544973 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544976 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544979 2577 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544983 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544990 2577 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544993 2577 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544996 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.544999 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545003 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545006 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545009 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545012 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545015 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545018 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545021 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545024 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545027 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545030 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545033 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545036 2577 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545040 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545043 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:30:22.547894 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545046 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 11:30:22.547894
ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545050 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545052 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545056 2577 flags.go:64] FLAG: --help="false" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545059 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-134-64.ec2.internal" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545062 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545065 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545068 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545071 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545075 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545078 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545081 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545083 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545088 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545091 2577 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545094 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545097 2577 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545100 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545103 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545106 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545109 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545112 2577 flags.go:64] FLAG: --lock-file="" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545114 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545117 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545120 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:30:22.548569 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545126 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545128 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545131 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545134 2577 flags.go:64] FLAG: --logging-format="text" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:30:22.545137 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545140 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545143 2577 flags.go:64] FLAG: --manifest-url="" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545146 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545150 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545153 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545158 2577 flags.go:64] FLAG: --max-pods="110" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545161 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545164 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545167 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545170 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545173 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545176 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545179 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545187 2577 flags.go:64] 
FLAG: --node-status-max-images="50" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545190 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545197 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545200 2577 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545203 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:30:22.549184 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545209 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545212 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545215 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545218 2577 flags.go:64] FLAG: --port="10250" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545221 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545224 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03be51cce89292678" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545228 2577 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545231 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545234 2577 flags.go:64] FLAG: --register-node="true" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545237 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 17 
11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545240 2577 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545244 2577 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545247 2577 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545250 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545253 2577 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545256 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545259 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545262 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545265 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545280 2577 flags.go:64] FLAG: --runonce="false" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545283 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545287 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545290 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545293 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545297 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" 
Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545300 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:30:22.549749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545306 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545309 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545312 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545316 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545319 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545322 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545325 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545329 2577 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545331 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545337 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545340 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545342 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545346 2577 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545349 2577 
flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545352 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545355 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545357 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545361 2577 flags.go:64] FLAG: --v="2" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545365 2577 flags.go:64] FLAG: --version="false" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545369 2577 flags.go:64] FLAG: --vmodule="" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545373 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.545376 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545468 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545472 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:30:22.550415 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545476 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545479 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545482 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 
11:30:22.545485 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545487 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545491 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545493 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545496 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545500 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545503 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545505 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545508 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545511 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545514 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545516 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545519 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:30:22.550988 
ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545521 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545524 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545527 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:30:22.550988 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545529 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545532 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545535 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545537 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545540 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545542 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545545 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545548 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545550 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:30:22.551520 ip-10-0-134-64 
kubenswrapper[2577]: W0417 11:30:22.545554 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545558 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545561 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545564 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545566 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545569 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545572 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545575 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545578 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545580 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545583 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:30:22.551520 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545586 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545590 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 
11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545592 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545595 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545598 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545600 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545603 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545606 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545608 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545611 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545613 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545616 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545618 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545621 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545624 2577 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545626 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545629 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545632 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545634 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545637 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:30:22.552092 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545640 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545642 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545644 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545647 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545650 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545652 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545655 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 
11:30:22.545658 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545660 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545663 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545665 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545669 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545673 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545677 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545680 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545683 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545686 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545689 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545692 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545695 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 
11:30:22.552909 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545697 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:30:22.553521 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545700 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:30:22.553521 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545702 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:30:22.553521 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545705 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:30:22.553521 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.545707 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:30:22.553521 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.546455 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 11:30:22.554648 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.554628 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 11:30:22.554687 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.554649 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 11:30:22.554715 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554697 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:30:22.554715 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554704 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:22.554715 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554708 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:22.554715 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554711 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:22.554715 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554714 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:22.554715 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554717 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554720 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554723 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554726 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554729 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554732 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554735 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554737 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554740 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554743 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554746 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554749 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554752 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554754 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554757 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554759 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554762 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554765 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554768 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554771 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:22.554864 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554773 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554777 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554780 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554783 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554785 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554789 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554791 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554794 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554797 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554799 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554802 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554804 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554807 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554810 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554813 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554816 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554819 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554822 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554825 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:22.555401 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554827 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554830 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554832 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554835 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554838 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554840 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554843 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554846 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554849 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554851 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554854 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554856 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554859 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554862 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554864 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554867 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554870 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554873 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554876 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554878 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:22.555875 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554881 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554884 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554886 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554889 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554892 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554894 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554897 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554900 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554903 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554906 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554909 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554911 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554915 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554917 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554920 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554923 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554925 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554928 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554931 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554933 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:22.556384 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554936 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.554938 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.554943 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555042 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555047 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555050 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555053 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555056 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555058 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555062 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555065 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555068 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555071 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555074 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555077 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555079 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:30:22.556869 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555082 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555085 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555087 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555090 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555092 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555095 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555098 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555100 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555103 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555106 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555109 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555111 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555114 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555116 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555119 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555121 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555124 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555126 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555129 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555131 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:30:22.557335 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555134 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555137 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555139 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555142 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555144 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555147 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555150 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555152 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555155 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555158 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555161 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555163 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555166 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555168 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555171 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555173 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555176 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555179 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555181 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:30:22.557823 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555184 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555186 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555189 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555193 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555195 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555199 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555202 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555204 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555207 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555209 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555212 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555214 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555218 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555220 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555223 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555226 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555228 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555232 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555235 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:30:22.558290 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555239 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555242 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555244 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555247 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555250 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555252 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555255 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555258 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555260 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555263 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555283 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555286 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555289 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555291 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:22.555294 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:30:22.558780 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.555299 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:30:22.559160 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.555960 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:30:22.559160 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.558024 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:30:22.559216 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.559159 2577 server.go:1019] "Starting client certificate rotation"
Apr 17 11:30:22.559285 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.559253 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:30:22.559322 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.559308 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:30:22.587181 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.587161 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:30:22.589833 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.589819 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:30:22.603788 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.603767 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:30:22.610515 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.610496 2577 log.go:25] "Validated CRI v1 image API"
Apr 17 11:30:22.611803 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.611780 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:30:22.613325 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.613304 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:30:22.618686 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.618667 2577 fs.go:135] Filesystem UUIDs: map[6f4acf59-6ffe-47ad-9dde-0bef30c9e3ab:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f2baf5d0-d8f7-48c1-8f5f-e7389b22acfd:/dev/nvme0n1p3]
Apr 17 11:30:22.618753 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.618686 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:30:22.625590 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.625475 2577 manager.go:217] Machine: {Timestamp:2026-04-17 11:30:22.623494354 +0000 UTC m=+0.421208966 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095359 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e10880f9740e188235ff48f1f6d0b SystemUUID:ec2e1088-0f97-40e1-8823-5ff48f1f6d0b BootID:78ef1bcb-a298-437f-a3d2-8f0728fef4a7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:71:ad:72:d4:ef Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:71:ad:72:d4:ef Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:5e:67:eb:04:70 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:30:22.625590 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.625578 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:30:22.625724 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.625659 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:30:22.626766 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.626746 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:30:22.626904 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.626769 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-64.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:30:22.626950 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.626912 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:30:22.626950 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.626920 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:30:22.626950 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.626933 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:30:22.628490 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.628478 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:30:22.629880 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.629870 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:30:22.629989 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.629979 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 11:30:22.632515 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.632505 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 11:30:22.632549 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.632517 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 11:30:22.632549 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.632531 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 11:30:22.632549 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.632540 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 17 11:30:22.632549 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.632549 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 11:30:22.633747 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.633731 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:30:22.633747 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.633750 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:30:22.634883 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.634864 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-smtgp"
Apr 17 11:30:22.637577 ip-10-0-134-64
kubenswrapper[2577]: I0417 11:30:22.637563 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 11:30:22.638818 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.638806 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:30:22.639618 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.639601 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-smtgp" Apr 17 11:30:22.640596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640582 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640600 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640607 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640615 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640622 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640630 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640638 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640645 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:30:22.640657 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:30:22.640653 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:30:22.640857 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640662 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:30:22.640857 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640705 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 11:30:22.640857 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.640717 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:30:22.641415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.641405 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:30:22.641447 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.641416 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:30:22.659173 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.659159 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:30:22.659254 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.659200 2577 server.go:1295] "Started kubelet" Apr 17 11:30:22.659361 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.659288 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 11:30:22.659395 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.659335 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:30:22.659426 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.659402 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:30:22.660148 ip-10-0-134-64 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 11:30:22.661187 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.661145 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 11:30:22.661821 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.661804 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 11:30:22.663489 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.663471 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:22.664761 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.664745 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:22.666125 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.666109 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 11:30:22.666216 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.666110 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-64.ec2.internal" not found
Apr 17 11:30:22.666560 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.666543 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 11:30:22.667533 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.667470 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-64.ec2.internal\" not found"
Apr 17 11:30:22.668024 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.668007 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 11:30:22.668476 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.668297 2577 factory.go:55] Registering systemd factory
Apr 17 11:30:22.668618 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.668601 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 11:30:22.668711 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.668641 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 11:30:22.668711 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.668614 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 17 11:30:22.669150 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.669131 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 11:30:22.669150 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.669148 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 11:30:22.669310 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.669293 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:22.670626 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.670607 2577 factory.go:153] Registering CRI-O factory
Apr 17 11:30:22.670626 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.670629 2577 factory.go:223] Registration of the crio container factory successfully
Apr 17 11:30:22.670770 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.670680 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 11:30:22.670770 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.670703 2577 factory.go:103] Registering Raw factory
Apr 17 11:30:22.670770 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.670748 2577 manager.go:1196] Started watching for new ooms in manager
Apr 17 11:30:22.671339 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.671316 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 11:30:22.671444 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.671428 2577 manager.go:319] Starting recovery of all containers
Apr 17 11:30:22.672138 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.672116 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-64.ec2.internal\" not found" node="ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.681208 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.681083 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-64.ec2.internal" not found
Apr 17 11:30:22.682149 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.682132 2577 manager.go:324] Recovery completed
Apr 17 11:30:22.683823 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.683801 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 11:30:22.686662 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.686650 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:22.688458 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.688363 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:22.688458 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.688393 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:22.688458 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.688404 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-64.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:22.688909 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.688896 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 11:30:22.688909 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.688909 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 11:30:22.688988 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.688925 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:30:22.691539 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.691526 2577 policy_none.go:49] "None policy: Start"
Apr 17 11:30:22.691588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.691541 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 11:30:22.691588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.691551 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 11:30:22.734841 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.734827 2577 manager.go:341] "Starting Device Plugin manager"
Apr 17 11:30:22.734952 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.734863 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 11:30:22.734952 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.734877 2577 server.go:85] "Starting device plugin registration server"
Apr 17 11:30:22.735101 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.735088 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 11:30:22.735147 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.735102 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 11:30:22.735221 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.735203 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 11:30:22.735314 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.735303 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 11:30:22.735368 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.735316 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 11:30:22.735799 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.735779 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 11:30:22.735906 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.735812 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-64.ec2.internal\" not found"
Apr 17 11:30:22.737597 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.737578 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-64.ec2.internal" not found
Apr 17 11:30:22.792223 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.792166 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 11:30:22.793304 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.793286 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 11:30:22.793378 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.793311 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 11:30:22.793378 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.793332 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 11:30:22.793378 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.793341 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 11:30:22.793378 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:22.793370 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 11:30:22.796393 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.796375 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:22.836087 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.836053 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:30:22.837716 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.837701 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-64.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:30:22.837782 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.837729 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-64.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:30:22.837782 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.837741 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-64.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:30:22.837782 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.837763 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.846221 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.846207 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.893786 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.893761 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal"]
Apr 17 11:30:22.896333 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.896315 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.896430 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.896322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.923900 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.923880 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.928401 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.928387 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.933571 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.933556 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:30:22.938888 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.938872 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:30:22.971101 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.971078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/060506f2615e2a76fac2c219a480cfb1-config\") pod \"kube-apiserver-proxy-ip-10-0-134-64.ec2.internal\" (UID: \"060506f2615e2a76fac2c219a480cfb1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.971194 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.971105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0af5d3062aed12d9101a8cacb1a07582-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal\" (UID: \"0af5d3062aed12d9101a8cacb1a07582\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:22.971194 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:22.971124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0af5d3062aed12d9101a8cacb1a07582-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal\" (UID: \"0af5d3062aed12d9101a8cacb1a07582\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.071415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.071338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/060506f2615e2a76fac2c219a480cfb1-config\") pod \"kube-apiserver-proxy-ip-10-0-134-64.ec2.internal\" (UID: \"060506f2615e2a76fac2c219a480cfb1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.071415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.071368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0af5d3062aed12d9101a8cacb1a07582-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal\" (UID: \"0af5d3062aed12d9101a8cacb1a07582\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.071415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.071384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0af5d3062aed12d9101a8cacb1a07582-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal\" (UID: \"0af5d3062aed12d9101a8cacb1a07582\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.071594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.071416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0af5d3062aed12d9101a8cacb1a07582-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal\" (UID: \"0af5d3062aed12d9101a8cacb1a07582\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.071594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.071414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/060506f2615e2a76fac2c219a480cfb1-config\") pod \"kube-apiserver-proxy-ip-10-0-134-64.ec2.internal\" (UID: \"060506f2615e2a76fac2c219a480cfb1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.071594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.071449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0af5d3062aed12d9101a8cacb1a07582-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal\" (UID: \"0af5d3062aed12d9101a8cacb1a07582\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.236042 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.236002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.241433 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.241417 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal"
Apr 17 11:30:23.559124 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.559064 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 11:30:23.559732 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.559200 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:30:23.559732 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.559204 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:30:23.559732 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.559228 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:30:23.632794 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.632765 2577 apiserver.go:52] "Watching apiserver"
Apr 17 11:30:23.641349 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.641310 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:25:22 +0000 UTC" deadline="2027-12-21 18:09:02.952388765 +0000 UTC"
Apr 17 11:30:23.641349 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.641346 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14718h38m39.311045885s"
Apr 17 11:30:23.642907 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.642883 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 11:30:23.643211 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.643193 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84","openshift-cluster-node-tuning-operator/tuned-524kl","openshift-image-registry/node-ca-frb9s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal","openshift-multus/multus-additional-cni-plugins-fl46q","openshift-multus/network-metrics-daemon-xw9bz","openshift-network-diagnostics/network-check-target-cgpzp","kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal","openshift-dns/node-resolver-stmhs","openshift-multus/multus-cgvrz","openshift-network-operator/iptables-alerter-qkqfc","openshift-ovn-kubernetes/ovnkube-node-znj2s","kube-system/konnectivity-agent-fxwv2"]
Apr 17 11:30:23.645456 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.645438 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84"
Apr 17 11:30:23.646859 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.646839 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-524kl"
Apr 17 11:30:23.646950 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.646920 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-frb9s"
Apr 17 11:30:23.647736 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.647708 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 11:30:23.647821 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.647769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 11:30:23.647872 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.647848 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qzfrx\""
Apr 17 11:30:23.647981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.647969 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 11:30:23.648115 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.648101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fl46q"
Apr 17 11:30:23.649623 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.649605 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:30:23.649723 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.649624 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9b2wx\""
Apr 17 11:30:23.649865 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.649844 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:23.649950 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.649910 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:23.650218 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650099 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 11:30:23.650218 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650118 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 11:30:23.650218 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650121 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 11:30:23.650218 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650156 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5525m\""
Apr 17 11:30:23.650461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650217 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 11:30:23.650461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650430 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 11:30:23.650461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650446 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 11:30:23.650461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650460 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 11:30:23.650574 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650473 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-z79m4\""
Apr 17 11:30:23.650574 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.650460 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 11:30:23.651186 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.651171 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 11:30:23.652416 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.652401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:23.652486 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.652469 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:23.652524 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.652496 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-stmhs"
Apr 17 11:30:23.653620 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.653606 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:23.654358 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.654345 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ptsrr\""
Apr 17 11:30:23.654521 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.654504 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 11:30:23.654581 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.654537 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 11:30:23.654863 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.654849 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qkqfc"
Apr 17 11:30:23.655660 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.655643 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 11:30:23.655756 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.655686 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hw8ht\""
Apr 17 11:30:23.656248 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.656231 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.656817 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.656801 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:30:23.656982 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.656969 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:30:23.656982 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.656977 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:30:23.657116 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.657102 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z9q6w\"" Apr 17 11:30:23.657307 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.657291 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-fxwv2" Apr 17 11:30:23.658330 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.658314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:30:23.658390 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.658344 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:30:23.658613 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.658580 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:30:23.658613 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.658608 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:30:23.658720 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.658710 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5z96q\"" Apr 17 11:30:23.658776 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.658727 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:30:23.659939 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.659894 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:30:23.660819 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.660796 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j7bcj\"" Apr 17 11:30:23.660898 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.660805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 
11:30:23.660898 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.660842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 11:30:23.667342 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.666432 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 11:30:23.669428 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.669410 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:30:23.674058 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-host\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.674147 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-sys\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.674147 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw4tc\" (UniqueName: \"kubernetes.io/projected/517a579e-7efd-4d38-8225-2b0c7c48d532-kube-api-access-mw4tc\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.674147 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-ovn\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674341 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-node-log\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674341 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-log-socket\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674341 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-ovnkube-config\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674341 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-env-overrides\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674341 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674319 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thz6c\" (UniqueName: \"kubernetes.io/projected/3f06ebee-cbe3-4266-bf01-0bb889437be7-kube-api-access-thz6c\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-lib-modules\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-cni-multus\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-var-lib-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-iptables-alerter-script\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " 
pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-host-slash\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-modprobe-d\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-cnibin\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.674573 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-systemd\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-conf-dir\") pod 
\"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ppd\" (UniqueName: \"kubernetes.io/projected/ff8d9fca-80b5-4d5a-99c0-374a747b0900-kube-api-access-q8ppd\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-etc-selinux\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-run\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-cni-binary-copy\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-kubelet\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674771 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-cni-bin\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f06ebee-cbe3-4266-bf01-0bb889437be7-tmp-dir\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-sys-fs\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-var-lib-kubelet\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674874 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-system-cni-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.674919 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-cni-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-etc-kubernetes\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.674984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-registration-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 
11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kh25\" (UniqueName: \"kubernetes.io/projected/187342e6-1155-44f7-a799-bfeab7d58152-kube-api-access-2kh25\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675107 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-cni-bin\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-cni-netd\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-device-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysconfig\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-slash\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-tuned\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675244 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/517a579e-7efd-4d38-8225-2b0c7c48d532-cni-binary-copy\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675292 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-netns\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675308 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-hostroot\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-kubelet\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqb72\" (UniqueName: \"kubernetes.io/projected/43535899-eb5a-4030-8bab-db2650a0cbff-kube-api-access-vqb72\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.675792 ip-10-0-134-64 
kubenswrapper[2577]: I0417 11:30:23.675400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-kubernetes\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-systemd\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-cnibin\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675500 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-systemd-units\") pod \"ovnkube-node-znj2s\" 
(UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-host\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f06ebee-cbe3-4266-bf01-0bb889437be7-hosts-file\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-os-release\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-multus-certs\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-ovnkube-script-lib\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.675792 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysctl-d\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tph5z\" (UniqueName: \"kubernetes.io/projected/4155f35e-1865-499f-88fb-fdde1e2c1218-kube-api-access-tph5z\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:30:23.675705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43535899-eb5a-4030-8bab-db2650a0cbff-ovn-node-metrics-cert\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-serviceca\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/93c44504-2cac-4acc-82af-a24fa55d1c56-agent-certs\") pod \"konnectivity-agent-fxwv2\" (UID: \"93c44504-2cac-4acc-82af-a24fa55d1c56\") " pod="kube-system/konnectivity-agent-fxwv2" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/93c44504-2cac-4acc-82af-a24fa55d1c56-konnectivity-ca\") pod \"konnectivity-agent-fxwv2\" (UID: \"93c44504-2cac-4acc-82af-a24fa55d1c56\") " pod="kube-system/konnectivity-agent-fxwv2" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysctl-conf\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " 
pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675833 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klfd4\" (UniqueName: \"kubernetes.io/projected/bb2737eb-5571-4fee-8d9a-10110cc1a205-kube-api-access-klfd4\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-socket-dir-parent\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-daemon-config\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-k8s-cni-cncf-io\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675964 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-etc-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.675987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-run-ovn-kubernetes\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676010 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mm69\" (UniqueName: \"kubernetes.io/projected/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-kube-api-access-6mm69\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " 
pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.676434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-os-release\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.676910 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-run-netns\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.676910 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676113 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thddn\" (UniqueName: \"kubernetes.io/projected/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-kube-api-access-thddn\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.676910 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8d9fca-80b5-4d5a-99c0-374a747b0900-tmp\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.676910 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-system-cni-dir\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.676910 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-socket-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.676910 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.676519 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:30:23.686385 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:23.686340 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod060506f2615e2a76fac2c219a480cfb1.slice/crio-9805d0375aa23453125082c2edad06a941eb00ffb20c1fd51ab21b14436c2caf WatchSource:0}: Error finding container 9805d0375aa23453125082c2edad06a941eb00ffb20c1fd51ab21b14436c2caf: Status 404 returned error can't find the container with id 9805d0375aa23453125082c2edad06a941eb00ffb20c1fd51ab21b14436c2caf Apr 17 11:30:23.686618 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:23.686599 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af5d3062aed12d9101a8cacb1a07582.slice/crio-e791d95dfafc80eac4f11958688b5cb075d21844cdfdd90b4fc0039638a1a119 WatchSource:0}: Error finding container e791d95dfafc80eac4f11958688b5cb075d21844cdfdd90b4fc0039638a1a119: Status 404 returned error can't find the container with id 
e791d95dfafc80eac4f11958688b5cb075d21844cdfdd90b4fc0039638a1a119 Apr 17 11:30:23.690622 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.690607 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:30:23.696634 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.696615 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zgwzf" Apr 17 11:30:23.708187 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.708164 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zgwzf" Apr 17 11:30:23.776785 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-device-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.776785 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysconfig\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-slash\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776828 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-tuned\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/517a579e-7efd-4d38-8225-2b0c7c48d532-cni-binary-copy\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-netns\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-device-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-slash\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-hostroot\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.776968 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.776893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysconfig\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-hostroot\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-netns\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-kubelet\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqb72\" (UniqueName: 
\"kubernetes.io/projected/43535899-eb5a-4030-8bab-db2650a0cbff-kube-api-access-vqb72\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-kubernetes\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-kubelet\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-systemd\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-kubernetes\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-cnibin\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-systemd\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-cnibin\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-systemd-units\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777160 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is 
enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-host\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f06ebee-cbe3-4266-bf01-0bb889437be7-hosts-file\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.777379 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-os-release\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-host\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3f06ebee-cbe3-4266-bf01-0bb889437be7-hosts-file\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-systemd-units\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-multus-certs\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-ovnkube-script-lib\") pod 
\"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-multus-certs\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/517a579e-7efd-4d38-8225-2b0c7c48d532-cni-binary-copy\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysctl-d\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-os-release\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tph5z\" (UniqueName: \"kubernetes.io/projected/4155f35e-1865-499f-88fb-fdde1e2c1218-kube-api-access-tph5z\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43535899-eb5a-4030-8bab-db2650a0cbff-ovn-node-metrics-cert\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-serviceca\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/93c44504-2cac-4acc-82af-a24fa55d1c56-agent-certs\") pod \"konnectivity-agent-fxwv2\" (UID: \"93c44504-2cac-4acc-82af-a24fa55d1c56\") " pod="kube-system/konnectivity-agent-fxwv2" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysctl-d\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.777981 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.777783 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.778616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.778118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-ovnkube-script-lib\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.778616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.778149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/93c44504-2cac-4acc-82af-a24fa55d1c56-konnectivity-ca\") pod \"konnectivity-agent-fxwv2\" (UID: \"93c44504-2cac-4acc-82af-a24fa55d1c56\") " pod="kube-system/konnectivity-agent-fxwv2" Apr 17 11:30:23.778616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.778190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysctl-conf\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.778616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.778213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klfd4\" (UniqueName: \"kubernetes.io/projected/bb2737eb-5571-4fee-8d9a-10110cc1a205-kube-api-access-klfd4\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.778616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.778514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-socket-dir-parent\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.778616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.778570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-daemon-config\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.779090 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.779067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-socket-dir-parent\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.779233 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.779215 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-sysctl-conf\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.779488 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.779379 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-serviceca\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.779897 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.779875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-daemon-config\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.780682 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.780657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/93c44504-2cac-4acc-82af-a24fa55d1c56-konnectivity-ca\") pod \"konnectivity-agent-fxwv2\" (UID: \"93c44504-2cac-4acc-82af-a24fa55d1c56\") " pod="kube-system/konnectivity-agent-fxwv2" Apr 17 11:30:23.780920 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.780901 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-tuned\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.781309 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.778605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:23.781404 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-k8s-cni-cncf-io\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.781461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-etc-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.781461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-run-ovn-kubernetes\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.781560 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.781560 ip-10-0-134-64 
kubenswrapper[2577]: I0417 11:30:23.781514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mm69\" (UniqueName: \"kubernetes.io/projected/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-kube-api-access-6mm69\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.781560 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781535 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43535899-eb5a-4030-8bab-db2650a0cbff-ovn-node-metrics-cert\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.781560 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-os-release\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.781726 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-run-ovn-kubernetes\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.781771 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-run-netns\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.781818 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-run-netns\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.782153 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thddn\" (UniqueName: \"kubernetes.io/projected/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-kube-api-access-thddn\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s" Apr 17 11:30:23.782318 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8d9fca-80b5-4d5a-99c0-374a747b0900-tmp\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.782447 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-os-release\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.782539 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-system-cni-dir\") pod \"multus-additional-cni-plugins-fl46q\" (UID: 
\"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.782686 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-socket-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.782759 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-run-k8s-cni-cncf-io\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.782759 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.781700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.782759 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-etc-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.782759 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-host\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.782759 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb2737eb-5571-4fee-8d9a-10110cc1a205-system-cni-dir\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.782995 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-sys\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.782995 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.782831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-sys\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.783638 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-host\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.783638 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-socket-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.783787 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw4tc\" (UniqueName: \"kubernetes.io/projected/517a579e-7efd-4d38-8225-2b0c7c48d532-kube-api-access-mw4tc\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.783787 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-ovn\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.783787 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-node-log\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.783787 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-log-socket\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-ovnkube-config\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-ovn\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-node-log\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-log-socket\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-env-overrides\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thz6c\" (UniqueName: 
\"kubernetes.io/projected/3f06ebee-cbe3-4266-bf01-0bb889437be7-kube-api-access-thz6c\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.784004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.783926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-lib-modules\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-lib-modules\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-cni-multus\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-var-lib-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-iptables-alerter-script\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-host-slash\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-modprobe-d\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-cni-multus\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-cnibin\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-host-slash\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-systemd\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-conf-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8ppd\" (UniqueName: \"kubernetes.io/projected/ff8d9fca-80b5-4d5a-99c0-374a747b0900-kube-api-access-q8ppd\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.784298 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784296 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-ovnkube-config\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-etc-selinux\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-run\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-var-lib-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43535899-eb5a-4030-8bab-db2650a0cbff-env-overrides\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-cni-binary-copy\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-kubelet\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-cni-bin\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f06ebee-cbe3-4266-bf01-0bb889437be7-tmp-dir\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-etc-modprobe-d\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-sys-fs\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-var-lib-kubelet\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-system-cni-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-cni-bin\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-kubelet\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-cni-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-etc-kubernetes\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/93c44504-2cac-4acc-82af-a24fa55d1c56-agent-certs\") pod \"konnectivity-agent-fxwv2\" (UID: \"93c44504-2cac-4acc-82af-a24fa55d1c56\") " pod="kube-system/konnectivity-agent-fxwv2" Apr 17 11:30:23.784870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-registration-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3f06ebee-cbe3-4266-bf01-0bb889437be7-tmp-dir\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kh25\" (UniqueName: 
\"kubernetes.io/projected/187342e6-1155-44f7-a799-bfeab7d58152-kube-api-access-2kh25\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-sys-fs\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-iptables-alerter-script\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " pod="openshift-network-operator/iptables-alerter-qkqfc" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-var-lib-kubelet\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:30:23.784746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-openvswitch\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-system-cni-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-cni-binary-copy\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-cni-bin\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz" Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:30:23.784827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-cnibin\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-host-var-lib-cni-bin\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-run-systemd\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-cni-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:23.785455 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-etc-selinux\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-multus-conf-dir\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.784985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff8d9fca-80b5-4d5a-99c0-374a747b0900-run\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.785033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/187342e6-1155-44f7-a799-bfeab7d58152-registration-dir\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.785045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff8d9fca-80b5-4d5a-99c0-374a747b0900-tmp\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.785053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/517a579e-7efd-4d38-8225-2b0c7c48d532-etc-kubernetes\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.785062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-cni-netd\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.785094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43535899-eb5a-4030-8bab-db2650a0cbff-host-cni-netd\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.785200 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.785304 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:24.285247556 +0000 UTC m=+2.082962159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:23.785896 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.785347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bb2737eb-5571-4fee-8d9a-10110cc1a205-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q"
Apr 17 11:30:23.786525 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.786506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqb72\" (UniqueName: \"kubernetes.io/projected/43535899-eb5a-4030-8bab-db2650a0cbff-kube-api-access-vqb72\") pod \"ovnkube-node-znj2s\" (UID: \"43535899-eb5a-4030-8bab-db2650a0cbff\") " pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:23.788638 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.788621 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:30:23.788707 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.788640 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:30:23.788707 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.788652 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5m77s for pod openshift-network-diagnostics/network-check-target-cgpzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:23.788707 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:23.788696 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s podName:34d03c01-00bf-416b-8b46-2274587cc240 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:24.288682793 +0000 UTC m=+2.086397392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5m77s" (UniqueName: "kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s") pod "network-check-target-cgpzp" (UID: "34d03c01-00bf-416b-8b46-2274587cc240") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:23.790797 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.790773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tph5z\" (UniqueName: \"kubernetes.io/projected/4155f35e-1865-499f-88fb-fdde1e2c1218-kube-api-access-tph5z\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:23.790927 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.790902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klfd4\" (UniqueName: \"kubernetes.io/projected/bb2737eb-5571-4fee-8d9a-10110cc1a205-kube-api-access-klfd4\") pod \"multus-additional-cni-plugins-fl46q\" (UID: \"bb2737eb-5571-4fee-8d9a-10110cc1a205\") " pod="openshift-multus/multus-additional-cni-plugins-fl46q"
Apr 17 11:30:23.791758 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.791727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw4tc\" (UniqueName: \"kubernetes.io/projected/517a579e-7efd-4d38-8225-2b0c7c48d532-kube-api-access-mw4tc\") pod \"multus-cgvrz\" (UID: \"517a579e-7efd-4d38-8225-2b0c7c48d532\") " pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:23.791839 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.791795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thddn\" (UniqueName: \"kubernetes.io/projected/a1b0eda5-8b26-4ce3-af63-74364b0ea28f-kube-api-access-thddn\") pod \"node-ca-frb9s\" (UID: \"a1b0eda5-8b26-4ce3-af63-74364b0ea28f\") " pod="openshift-image-registry/node-ca-frb9s"
Apr 17 11:30:23.791839 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.791809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mm69\" (UniqueName: \"kubernetes.io/projected/ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac-kube-api-access-6mm69\") pod \"iptables-alerter-qkqfc\" (UID: \"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac\") " pod="openshift-network-operator/iptables-alerter-qkqfc"
Apr 17 11:30:23.791904 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.791809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thz6c\" (UniqueName: \"kubernetes.io/projected/3f06ebee-cbe3-4266-bf01-0bb889437be7-kube-api-access-thz6c\") pod \"node-resolver-stmhs\" (UID: \"3f06ebee-cbe3-4266-bf01-0bb889437be7\") " pod="openshift-dns/node-resolver-stmhs"
Apr 17 11:30:23.792468 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.792453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ppd\" (UniqueName: \"kubernetes.io/projected/ff8d9fca-80b5-4d5a-99c0-374a747b0900-kube-api-access-q8ppd\") pod \"tuned-524kl\" (UID: \"ff8d9fca-80b5-4d5a-99c0-374a747b0900\") " pod="openshift-cluster-node-tuning-operator/tuned-524kl"
Apr 17 11:30:23.792558 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.792543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kh25\" (UniqueName: \"kubernetes.io/projected/187342e6-1155-44f7-a799-bfeab7d58152-kube-api-access-2kh25\") pod \"aws-ebs-csi-driver-node-swb84\" (UID: \"187342e6-1155-44f7-a799-bfeab7d58152\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84"
Apr 17 11:30:23.797139 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.797097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal" event={"ID":"0af5d3062aed12d9101a8cacb1a07582","Type":"ContainerStarted","Data":"e791d95dfafc80eac4f11958688b5cb075d21844cdfdd90b4fc0039638a1a119"}
Apr 17 11:30:23.797892 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.797875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal" event={"ID":"060506f2615e2a76fac2c219a480cfb1","Type":"ContainerStarted","Data":"9805d0375aa23453125082c2edad06a941eb00ffb20c1fd51ab21b14436c2caf"}
Apr 17 11:30:23.993866 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:23.993839 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84"
Apr 17 11:30:24.000429 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.000411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-524kl"
Apr 17 11:30:24.006023 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:24.005991 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff8d9fca_80b5_4d5a_99c0_374a747b0900.slice/crio-2681058723ceb70954f87a1d281d7993ad9cbff393208c8b9ce1aab30d73771e WatchSource:0}: Error finding container 2681058723ceb70954f87a1d281d7993ad9cbff393208c8b9ce1aab30d73771e: Status 404 returned error can't find the container with id 2681058723ceb70954f87a1d281d7993ad9cbff393208c8b9ce1aab30d73771e
Apr 17 11:30:24.006099 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.006028 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-frb9s"
Apr 17 11:30:24.009873 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.009854 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fl46q"
Apr 17 11:30:24.015069 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:24.015039 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b0eda5_8b26_4ce3_af63_74364b0ea28f.slice/crio-00f3ce2929ebc94b15c0135eb26da1c1ad50fbf391cee2f198ef579f5830c84d WatchSource:0}: Error finding container 00f3ce2929ebc94b15c0135eb26da1c1ad50fbf391cee2f198ef579f5830c84d: Status 404 returned error can't find the container with id 00f3ce2929ebc94b15c0135eb26da1c1ad50fbf391cee2f198ef579f5830c84d
Apr 17 11:30:24.015069 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.015059 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-stmhs"
Apr 17 11:30:24.020700 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.020679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cgvrz"
Apr 17 11:30:24.026643 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.026629 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qkqfc"
Apr 17 11:30:24.031730 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.031708 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:24.033286 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:24.033234 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f06ebee_cbe3_4266_bf01_0bb889437be7.slice/crio-d8cfbf983ef2dc6d9d99099ed6e9e1088665772733ecad8f1de679fe3b6dedff WatchSource:0}: Error finding container d8cfbf983ef2dc6d9d99099ed6e9e1088665772733ecad8f1de679fe3b6dedff: Status 404 returned error can't find the container with id d8cfbf983ef2dc6d9d99099ed6e9e1088665772733ecad8f1de679fe3b6dedff
Apr 17 11:30:24.036503 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.036469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fxwv2"
Apr 17 11:30:24.045491 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:24.045464 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf1ff91_10cb_4ca8_8a83_7ed6b852b5ac.slice/crio-5754d62e55d38cb92b17029a67911c079f35a8d46864ff94ff9946ff8c79edcc WatchSource:0}: Error finding container 5754d62e55d38cb92b17029a67911c079f35a8d46864ff94ff9946ff8c79edcc: Status 404 returned error can't find the container with id 5754d62e55d38cb92b17029a67911c079f35a8d46864ff94ff9946ff8c79edcc
Apr 17 11:30:24.047238 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:24.047203 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod517a579e_7efd_4d38_8225_2b0c7c48d532.slice/crio-7f35bb51acb957f067ef5aa8cf6e9663399c48545b5d57cfee272e67c0a091a7 WatchSource:0}: Error finding container 7f35bb51acb957f067ef5aa8cf6e9663399c48545b5d57cfee272e67c0a091a7: Status 404 returned error can't find the container with id 7f35bb51acb957f067ef5aa8cf6e9663399c48545b5d57cfee272e67c0a091a7
Apr 17 11:30:24.047939 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:24.047777 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43535899_eb5a_4030_8bab_db2650a0cbff.slice/crio-a7520ddb3addfa1a4825373e902c14ec14f1dfa10db627391cd8648470a3a6b1 WatchSource:0}: Error finding container a7520ddb3addfa1a4825373e902c14ec14f1dfa10db627391cd8648470a3a6b1: Status 404 returned error can't find the container with id a7520ddb3addfa1a4825373e902c14ec14f1dfa10db627391cd8648470a3a6b1
Apr 17 11:30:24.051059 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:24.051025 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c44504_2cac_4acc_82af_a24fa55d1c56.slice/crio-caf0ecb1ef249f770017d0333b81d33f350677f56509d6db060ca59097ce28e7 WatchSource:0}: Error finding container caf0ecb1ef249f770017d0333b81d33f350677f56509d6db060ca59097ce28e7: Status 404 returned error can't find the container with id caf0ecb1ef249f770017d0333b81d33f350677f56509d6db060ca59097ce28e7
Apr 17 11:30:24.288535 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.288456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:24.288689 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:24.288613 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:24.288689 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:24.288683 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:25.288664309 +0000 UTC m=+3.086378906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:24.387766 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.387729 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:24.389283 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.389246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:24.389443 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:24.389422 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:30:24.389506 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:24.389449 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:30:24.389506 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:24.389462 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5m77s for pod openshift-network-diagnostics/network-check-target-cgpzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:24.389599 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:24.389514 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s podName:34d03c01-00bf-416b-8b46-2274587cc240 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:25.389496596 +0000 UTC m=+3.187211205 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m77s" (UniqueName: "kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s") pod "network-check-target-cgpzp" (UID: "34d03c01-00bf-416b-8b46-2274587cc240") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:24.447718 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.447478 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:24.454749 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.454716 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:30:24.709747 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.709649 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:25:23 +0000 UTC" deadline="2028-01-08 11:59:58.467516872 +0000 UTC"
Apr 17 11:30:24.709747 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.709693 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15144h29m33.757827486s"
Apr 17 11:30:24.795717 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.795687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:24.795890 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:24.795814 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:24.820486 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.820448 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"a7520ddb3addfa1a4825373e902c14ec14f1dfa10db627391cd8648470a3a6b1"}
Apr 17 11:30:24.823594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.823538 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cgvrz" event={"ID":"517a579e-7efd-4d38-8225-2b0c7c48d532","Type":"ContainerStarted","Data":"7f35bb51acb957f067ef5aa8cf6e9663399c48545b5d57cfee272e67c0a091a7"}
Apr 17 11:30:24.826950 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.826891 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-stmhs" event={"ID":"3f06ebee-cbe3-4266-bf01-0bb889437be7","Type":"ContainerStarted","Data":"d8cfbf983ef2dc6d9d99099ed6e9e1088665772733ecad8f1de679fe3b6dedff"}
Apr 17 11:30:24.835718 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.835664 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerStarted","Data":"41256162afeaedc8b7bbe6214b4648a498ef3c55ce29bc9fbfb6616c2a32d3ac"}
Apr 17 11:30:24.845461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.843623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" event={"ID":"187342e6-1155-44f7-a799-bfeab7d58152","Type":"ContainerStarted","Data":"715fa1428984b410086eaa3b12dc5d5d5996c745600b322b7a4ff5ec60d61aa1"}
Apr 17 11:30:24.866002 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.864033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fxwv2" event={"ID":"93c44504-2cac-4acc-82af-a24fa55d1c56","Type":"ContainerStarted","Data":"caf0ecb1ef249f770017d0333b81d33f350677f56509d6db060ca59097ce28e7"}
Apr 17 11:30:24.867150 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.867113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qkqfc" event={"ID":"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac","Type":"ContainerStarted","Data":"5754d62e55d38cb92b17029a67911c079f35a8d46864ff94ff9946ff8c79edcc"}
Apr 17 11:30:24.878743 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.878717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-frb9s" event={"ID":"a1b0eda5-8b26-4ce3-af63-74364b0ea28f","Type":"ContainerStarted","Data":"00f3ce2929ebc94b15c0135eb26da1c1ad50fbf391cee2f198ef579f5830c84d"}
Apr 17 11:30:24.901450 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:24.901418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-524kl" event={"ID":"ff8d9fca-80b5-4d5a-99c0-374a747b0900","Type":"ContainerStarted","Data":"2681058723ceb70954f87a1d281d7993ad9cbff393208c8b9ce1aab30d73771e"}
Apr 17 11:30:25.298524 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:25.298482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:25.298710 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:25.298654 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:25.298770 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:25.298720 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:27.298700048 +0000 UTC m=+5.096414646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:25.399729 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:25.399692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:25.399922 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:25.399878 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:30:25.399922 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:25.399905 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:30:25.399922 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:25.399918 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5m77s for pod openshift-network-diagnostics/network-check-target-cgpzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:25.400094 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:25.399978 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s podName:34d03c01-00bf-416b-8b46-2274587cc240 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:27.399958718 +0000 UTC m=+5.197673326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m77s" (UniqueName: "kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s") pod "network-check-target-cgpzp" (UID: "34d03c01-00bf-416b-8b46-2274587cc240") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:25.710591 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:25.710497 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:25:23 +0000 UTC" deadline="2027-10-29 21:20:14.281959202 +0000 UTC"
Apr 17 11:30:25.710591 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:25.710536 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13449h49m48.571426912s"
Apr 17 11:30:25.794936 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:25.794414 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:25.794936 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:25.794627 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:26.796587 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:26.796067 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:26.796587 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:26.796198 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:27.274165 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.273313 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-49lr8"]
Apr 17 11:30:27.275652 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.275312 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:27.275652 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.275400 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:27.316039 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.315608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:27.316039 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.315732 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:27.316039 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.315798 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:31.315780844 +0000 UTC m=+9.113495450 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:27.416514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.416472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e74f83b5-a2a2-4262-89a6-a122df9a5401-kubelet-config\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:27.416707 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.416539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:27.416707 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.416568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e74f83b5-a2a2-4262-89a6-a122df9a5401-dbus\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:27.416707 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.416605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:27.416882 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.416747 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:30:27.416882 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.416768 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:30:27.416882 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.416780 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5m77s for pod openshift-network-diagnostics/network-check-target-cgpzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:27.416882 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.416837 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s podName:34d03c01-00bf-416b-8b46-2274587cc240 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:31.416818452 +0000 UTC m=+9.214533050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m77s" (UniqueName: "kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s") pod "network-check-target-cgpzp" (UID: "34d03c01-00bf-416b-8b46-2274587cc240") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:27.517790 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.517545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e74f83b5-a2a2-4262-89a6-a122df9a5401-kubelet-config\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:27.517939 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.517825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:27.517939 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.517858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e74f83b5-a2a2-4262-89a6-a122df9a5401-dbus\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:27.518058 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.518047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e74f83b5-a2a2-4262-89a6-a122df9a5401-dbus\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") "
pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:27.518096 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.518047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e74f83b5-a2a2-4262-89a6-a122df9a5401-kubelet-config\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:27.518174 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.518134 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:27.518210 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.518204 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret podName:e74f83b5-a2a2-4262-89a6-a122df9a5401 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:28.018185557 +0000 UTC m=+5.815900155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret") pod "global-pull-secret-syncer-49lr8" (UID: "e74f83b5-a2a2-4262-89a6-a122df9a5401") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:27.793742 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:27.793713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:27.793906 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:27.793856 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218" Apr 17 11:30:28.022141 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:28.022101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:28.022594 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:28.022290 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:28.022594 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:28.022357 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret podName:e74f83b5-a2a2-4262-89a6-a122df9a5401 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:29.022335401 +0000 UTC m=+6.820050012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret") pod "global-pull-secret-syncer-49lr8" (UID: "e74f83b5-a2a2-4262-89a6-a122df9a5401") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:28.794812 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:28.794346 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:28.794812 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:28.794354 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:28.794812 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:28.794471 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240" Apr 17 11:30:28.794812 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:28.794555 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401" Apr 17 11:30:29.030587 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:29.030549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:29.031002 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:29.030718 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:29.031002 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:29.030793 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret podName:e74f83b5-a2a2-4262-89a6-a122df9a5401 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:30:31.030774894 +0000 UTC m=+8.828489508 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret") pod "global-pull-secret-syncer-49lr8" (UID: "e74f83b5-a2a2-4262-89a6-a122df9a5401") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:29.793925 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:29.793579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:29.793925 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:29.793723 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218" Apr 17 11:30:30.797385 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:30.796696 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:30.797385 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:30.796827 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401" Apr 17 11:30:30.797385 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:30.797224 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:30.797385 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:30.797323 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240" Apr 17 11:30:31.049979 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:31.049351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:31.049979 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.049523 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:31.049979 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.049585 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret podName:e74f83b5-a2a2-4262-89a6-a122df9a5401 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:35.049565639 +0000 UTC m=+12.847280245 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret") pod "global-pull-secret-syncer-49lr8" (UID: "e74f83b5-a2a2-4262-89a6-a122df9a5401") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:31.351810 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:31.351721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:31.351981 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.351888 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:31.351981 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.351956 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:39.351934562 +0000 UTC m=+17.149649176 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:31.452407 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:31.452371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:31.452587 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.452552 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:30:31.452587 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.452574 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:30:31.452587 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.452585 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5m77s for pod openshift-network-diagnostics/network-check-target-cgpzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:31.452743 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.452640 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s podName:34d03c01-00bf-416b-8b46-2274587cc240 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:30:39.452622036 +0000 UTC m=+17.250336640 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m77s" (UniqueName: "kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s") pod "network-check-target-cgpzp" (UID: "34d03c01-00bf-416b-8b46-2274587cc240") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:31.794364 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:31.793812 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:31.794364 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:31.793962 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218" Apr 17 11:30:32.794991 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:32.794961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:32.795467 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:32.795062 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240" Apr 17 11:30:32.795467 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:32.795131 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:32.795467 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:32.795249 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401" Apr 17 11:30:33.793585 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:33.793556 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:33.793733 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:33.793708 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218" Apr 17 11:30:34.793972 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:34.793937 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:34.794400 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:34.793988 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:34.794400 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:34.794098 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240" Apr 17 11:30:34.794400 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:34.794226 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401" Apr 17 11:30:35.075225 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:35.075139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:35.075394 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:35.075308 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:35.075394 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:35.075380 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret podName:e74f83b5-a2a2-4262-89a6-a122df9a5401 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:30:43.075364672 +0000 UTC m=+20.873079270 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret") pod "global-pull-secret-syncer-49lr8" (UID: "e74f83b5-a2a2-4262-89a6-a122df9a5401") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:30:35.793628 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:35.793597 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:35.793808 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:35.793709 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218" Apr 17 11:30:36.793852 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:36.793816 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:36.794318 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:36.793935 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401" Apr 17 11:30:36.794318 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:36.793996 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:36.794318 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:36.794117 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240" Apr 17 11:30:37.794165 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:37.794128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:37.794744 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:37.794286 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218" Apr 17 11:30:38.794310 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:38.794251 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:38.794752 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:38.794395 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240" Apr 17 11:30:38.794752 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:38.794446 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8" Apr 17 11:30:38.794752 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:38.794565 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401" Apr 17 11:30:39.405091 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:39.405053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:39.405306 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:39.405186 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:39.405306 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:39.405281 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:55.405250512 +0000 UTC m=+33.202965108 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:30:39.506162 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:39.506124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:39.506380 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:39.506305 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:30:39.506380 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:39.506326 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:30:39.506380 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:39.506339 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5m77s for pod openshift-network-diagnostics/network-check-target-cgpzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:30:39.506526 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:39.506399 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s podName:34d03c01-00bf-416b-8b46-2274587cc240 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:30:55.506380903 +0000 UTC m=+33.304095505 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m77s" (UniqueName: "kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s") pod "network-check-target-cgpzp" (UID: "34d03c01-00bf-416b-8b46-2274587cc240") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:39.794145 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:39.794063 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:39.794327 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:39.794212 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:40.793965 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:40.793930 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:40.794128 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:40.794053 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:40.794128 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:40.794109 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:40.794231 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:40.794202 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:41.794434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:41.794407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:41.794769 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:41.794502 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:42.794888 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.794720 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:42.795490 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.794783 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:42.795490 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:42.795023 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:42.795490 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:42.794944 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:42.959463 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.959379 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" event={"ID":"187342e6-1155-44f7-a799-bfeab7d58152","Type":"ContainerStarted","Data":"19eacde78b9301da38583899c7b2de6aa104ada1132d3ba486ef68db5007f124"}
Apr 17 11:30:42.960581 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.960561 2577 generic.go:358] "Generic (PLEG): container finished" podID="0af5d3062aed12d9101a8cacb1a07582" containerID="417ac707c215eab8291924fa95ceeb8d0bb4748fa5d83dd7123ce37affb8d6d1" exitCode=0
Apr 17 11:30:42.960701 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.960613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal" event={"ID":"0af5d3062aed12d9101a8cacb1a07582","Type":"ContainerDied","Data":"417ac707c215eab8291924fa95ceeb8d0bb4748fa5d83dd7123ce37affb8d6d1"}
Apr 17 11:30:42.961733 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.961707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fxwv2" event={"ID":"93c44504-2cac-4acc-82af-a24fa55d1c56","Type":"ContainerStarted","Data":"4d2cad7444e200c80b817b43f149766effd064c7e35bcf538d16e1bde47e62d9"}
Apr 17 11:30:42.962835 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.962815 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-frb9s" event={"ID":"a1b0eda5-8b26-4ce3-af63-74364b0ea28f","Type":"ContainerStarted","Data":"e6b350a8064a541ebf1826d3f185c9d289e4af1a393b8b0b18fa5375dcf5313b"}
Apr 17 11:30:42.963913 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.963890 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-524kl" event={"ID":"ff8d9fca-80b5-4d5a-99c0-374a747b0900","Type":"ContainerStarted","Data":"2b14d1227c2112ef572e54092343cdf1ecff25f9be58d1563494760247e0b7c1"}
Apr 17 11:30:42.965005 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.964986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal" event={"ID":"060506f2615e2a76fac2c219a480cfb1","Type":"ContainerStarted","Data":"742b658df0c3f735a8448f36fa8d690986d9d7c59338d4a258c7bb56ccb10eb7"}
Apr 17 11:30:42.967147 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967132 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:30:42.967391 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967374 2577 generic.go:358] "Generic (PLEG): container finished" podID="43535899-eb5a-4030-8bab-db2650a0cbff" containerID="e4058b35a80beac83ee982a7fe9a602c7b59ef913cbca686d04db8eef23853f1" exitCode=1
Apr 17 11:30:42.967444 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"29c5e1de75dfa7da47e20a1577186f0d133d9d75a4fe624043923d6210d011ad"}
Apr 17 11:30:42.967444 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"e88f1ac53007eb121728de1ae636de9327ba295dff6e25fdc2b36e37fb5f1cca"}
Apr 17 11:30:42.967509 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"c8053da2ed5f08137e13154a79e21ae012abbd1c22e4c2410eae609c8517c5ca"}
Apr 17 11:30:42.967509 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"8df92da6cf72cdf2e905e2ccf7795a9e748c2ebccf8b6e55f6895db952e04b04"}
Apr 17 11:30:42.967509 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerDied","Data":"e4058b35a80beac83ee982a7fe9a602c7b59ef913cbca686d04db8eef23853f1"}
Apr 17 11:30:42.967509 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.967482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"47fcc34bc55d47a1e94069c0653eefda882a95f69e3ec272eef757b7edd9120c"}
Apr 17 11:30:42.968565 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.968547 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cgvrz" event={"ID":"517a579e-7efd-4d38-8225-2b0c7c48d532","Type":"ContainerStarted","Data":"3bbaa886ad4532496fa8744af5a9fa0cd5a9ceb2887219dc25f257c86b6c60d2"}
Apr 17 11:30:42.969680 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.969660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-stmhs" event={"ID":"3f06ebee-cbe3-4266-bf01-0bb889437be7","Type":"ContainerStarted","Data":"398f94c123688c910e522a7287f507d12bde5bb5806a7cde73e165f1506209b3"}
Apr 17 11:30:42.970836 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.970814 2577 generic.go:358] "Generic (PLEG): container finished" podID="bb2737eb-5571-4fee-8d9a-10110cc1a205" containerID="d97c45b8da7eecc262a6524e2253acf84fa23c860c7bb9ba9d84feb42d127be1" exitCode=0
Apr 17 11:30:42.970911 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.970846 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerDied","Data":"d97c45b8da7eecc262a6524e2253acf84fa23c860c7bb9ba9d84feb42d127be1"}
Apr 17 11:30:42.977699 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:42.977663 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fxwv2" podStartSLOduration=3.200549734 podStartE2EDuration="20.977653813s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.052977921 +0000 UTC m=+1.850692522" lastFinishedPulling="2026-04-17 11:30:41.830081995 +0000 UTC m=+19.627796601" observedRunningTime="2026-04-17 11:30:42.977355417 +0000 UTC m=+20.775070031" watchObservedRunningTime="2026-04-17 11:30:42.977653813 +0000 UTC m=+20.775368429"
Apr 17 11:30:43.014306 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.014244 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-64.ec2.internal" podStartSLOduration=21.014225957 podStartE2EDuration="21.014225957s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:30:43.014176014 +0000 UTC m=+20.811890631" watchObservedRunningTime="2026-04-17 11:30:43.014225957 +0000 UTC m=+20.811940576"
Apr 17 11:30:43.029675 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.029624 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-524kl" podStartSLOduration=3.20846551 podStartE2EDuration="21.029611477s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.010606821 +0000 UTC m=+1.808321416" lastFinishedPulling="2026-04-17 11:30:41.831752774 +0000 UTC m=+19.629467383" observedRunningTime="2026-04-17 11:30:43.029371364 +0000 UTC m=+20.827085979" watchObservedRunningTime="2026-04-17 11:30:43.029611477 +0000 UTC m=+20.827326094"
Apr 17 11:30:43.042953 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.042910 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-frb9s" podStartSLOduration=3.228169298 podStartE2EDuration="21.04289971s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.016602925 +0000 UTC m=+1.814317535" lastFinishedPulling="2026-04-17 11:30:41.831333351 +0000 UTC m=+19.629047947" observedRunningTime="2026-04-17 11:30:43.042536577 +0000 UTC m=+20.840251194" watchObservedRunningTime="2026-04-17 11:30:43.04289971 +0000 UTC m=+20.840614323"
Apr 17 11:30:43.056060 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.056004 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-stmhs" podStartSLOduration=3.262249927 podStartE2EDuration="21.055991998s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.036691224 +0000 UTC m=+1.834405820" lastFinishedPulling="2026-04-17 11:30:41.83043329 +0000 UTC m=+19.628147891" observedRunningTime="2026-04-17 11:30:43.055656891 +0000 UTC m=+20.853371508" watchObservedRunningTime="2026-04-17 11:30:43.055991998 +0000 UTC m=+20.853706614"
Apr 17 11:30:43.077090 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.077049 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cgvrz" podStartSLOduration=3.261843791 podStartE2EDuration="21.077033609s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.049331586 +0000 UTC m=+1.847046195" lastFinishedPulling="2026-04-17 11:30:41.864521418 +0000 UTC m=+19.662236013" observedRunningTime="2026-04-17 11:30:43.076690086 +0000 UTC m=+20.874404703" watchObservedRunningTime="2026-04-17 11:30:43.077033609 +0000 UTC m=+20.874748226"
Apr 17 11:30:43.136385 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.136346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:43.136535 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:43.136483 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:30:43.136582 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:43.136551 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret podName:e74f83b5-a2a2-4262-89a6-a122df9a5401 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:59.136532586 +0000 UTC m=+36.934247187 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret") pod "global-pull-secret-syncer-49lr8" (UID: "e74f83b5-a2a2-4262-89a6-a122df9a5401") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:30:43.527691 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.527666 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 11:30:43.746657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.746544 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:30:43.527687975Z","UUID":"f635e92e-9048-461f-a6a3-87d722cef6c2","Handler":null,"Name":"","Endpoint":""}
Apr 17 11:30:43.748500 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.748445 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 11:30:43.748500 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.748470 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 11:30:43.793616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.793537 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:43.793748 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:43.793646 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:43.975044 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.975002 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qkqfc" event={"ID":"ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac","Type":"ContainerStarted","Data":"e74a95cf38d86fe9122ff589dc049f7e3cc093209fd796547ce054e3e75a2425"}
Apr 17 11:30:43.977358 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:43.977326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" event={"ID":"187342e6-1155-44f7-a799-bfeab7d58152","Type":"ContainerStarted","Data":"78b728ab7181198460cfd642af6156f6e550d4d5cd8e660dd02b8cfdd10944da"}
Apr 17 11:30:44.007200 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.007151 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qkqfc" podStartSLOduration=4.225233421 podStartE2EDuration="22.007132865s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.048541696 +0000 UTC m=+1.846256309" lastFinishedPulling="2026-04-17 11:30:41.830441143 +0000 UTC m=+19.628155753" observedRunningTime="2026-04-17 11:30:43.991354254 +0000 UTC m=+21.789068869" watchObservedRunningTime="2026-04-17 11:30:44.007132865 +0000 UTC m=+21.804847483"
Apr 17 11:30:44.672140 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.672102 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fxwv2"
Apr 17 11:30:44.672766 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.672746 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fxwv2"
Apr 17 11:30:44.794597 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.794166 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:44.794597 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.794210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:44.794597 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:44.794290 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:44.794597 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:44.794348 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:44.982220 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.982145 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:30:44.982660 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.982629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"e69d4f0ad30d9e3c9762154dec655ff6de2d189d123cf581bab2a4ba077536ee"}
Apr 17 11:30:44.984776 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.984747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" event={"ID":"187342e6-1155-44f7-a799-bfeab7d58152","Type":"ContainerStarted","Data":"85eba6bee41ffd2b2f98b007a347e485ded8d1e99b88fc07ae67302b7bcb92fd"}
Apr 17 11:30:44.986502 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.986478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal" event={"ID":"0af5d3062aed12d9101a8cacb1a07582","Type":"ContainerStarted","Data":"e7f8cb277bf40f899778777cc964008bc8f210a1bb9a12460cad72dd257f1a8e"}
Apr 17 11:30:44.986731 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.986708 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fxwv2"
Apr 17 11:30:44.987171 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:44.987149 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fxwv2"
Apr 17 11:30:45.003840 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:45.003785 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swb84" podStartSLOduration=2.593395781 podStartE2EDuration="23.003771269s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.002111821 +0000 UTC m=+1.799826419" lastFinishedPulling="2026-04-17 11:30:44.412487309 +0000 UTC m=+22.210201907" observedRunningTime="2026-04-17 11:30:45.003349991 +0000 UTC m=+22.801064609" watchObservedRunningTime="2026-04-17 11:30:45.003771269 +0000 UTC m=+22.801485885"
Apr 17 11:30:45.033903 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:45.033843 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-64.ec2.internal" podStartSLOduration=23.033828343 podStartE2EDuration="23.033828343s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:30:45.033463104 +0000 UTC m=+22.831177721" watchObservedRunningTime="2026-04-17 11:30:45.033828343 +0000 UTC m=+22.831542959"
Apr 17 11:30:45.794458 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:45.794420 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:45.794633 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:45.794544 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:46.794302 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:46.794073 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:46.794739 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:46.794141 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:46.794739 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:46.794389 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:46.794739 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:46.794492 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:47.794646 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.794464 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:47.795301 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:47.794732 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:47.993537 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.993504 2577 generic.go:358] "Generic (PLEG): container finished" podID="bb2737eb-5571-4fee-8d9a-10110cc1a205" containerID="c01ccad30470da0c314be1c7b69cef0ecb4c037d846209cbc9bdd6bf6952a4d7" exitCode=0
Apr 17 11:30:47.993715 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.993585 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerDied","Data":"c01ccad30470da0c314be1c7b69cef0ecb4c037d846209cbc9bdd6bf6952a4d7"}
Apr 17 11:30:47.996519 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.996500 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:30:47.996844 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.996820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"48464ddf9d023575d73dcb763d6dbfcde0f1ecff668ad48a311ded68467b7bf4"}
Apr 17 11:30:47.997162 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.997142 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:47.997162 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.997166 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:47.997412 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:47.997389 2577 scope.go:117] "RemoveContainer" containerID="e4058b35a80beac83ee982a7fe9a602c7b59ef913cbca686d04db8eef23853f1"
Apr 17 11:30:48.015442 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:48.015419 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:48.794329 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:48.794300 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:48.794424 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:48.794339 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:48.794492 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:48.794437 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:48.794583 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:48.794560 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:49.000758 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.000675 2577 generic.go:358] "Generic (PLEG): container finished" podID="bb2737eb-5571-4fee-8d9a-10110cc1a205" containerID="0c916b459a313d9df7d99f1de6e26a974155504d8254a7515b17d176b89ed29c" exitCode=0
Apr 17 11:30:49.001168 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.000768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerDied","Data":"0c916b459a313d9df7d99f1de6e26a974155504d8254a7515b17d176b89ed29c"}
Apr 17 11:30:49.004079 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.004065 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:30:49.004385 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.004364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" event={"ID":"43535899-eb5a-4030-8bab-db2650a0cbff","Type":"ContainerStarted","Data":"1672cf8fb4ecfb15c33f165523c4833f4a10de997a32f6235ffbee1e6db2ae79"}
Apr 17 11:30:49.004681 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.004659 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:49.019117 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.019099 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s"
Apr 17 11:30:49.052877 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.052839 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" podStartSLOduration=9.075358208 podStartE2EDuration="27.052824679s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.050475461 +0000 UTC m=+1.848190071" lastFinishedPulling="2026-04-17 11:30:42.027941937 +0000 UTC m=+19.825656542" observedRunningTime="2026-04-17 11:30:49.05144177 +0000 UTC m=+26.849156386" watchObservedRunningTime="2026-04-17 11:30:49.052824679 +0000 UTC m=+26.850539296"
Apr 17 11:30:49.067088 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.067063 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-49lr8"]
Apr 17 11:30:49.067170 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.067139 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:49.067238 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:49.067216 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:49.070346 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.070322 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cgpzp"]
Apr 17 11:30:49.070432 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.070395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:49.070513 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:49.070472 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:49.070910 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.070893 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xw9bz"]
Apr 17 11:30:49.071011 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:49.070998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:49.071110 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:49.071095 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:50.007538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:50.007507 2577 generic.go:358] "Generic (PLEG): container finished" podID="bb2737eb-5571-4fee-8d9a-10110cc1a205" containerID="802dddf268e185101aeb6ebb7d84f6469df325e30bec735fc81cf131f6c4a3d4" exitCode=0
Apr 17 11:30:50.007945 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:50.007594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerDied","Data":"802dddf268e185101aeb6ebb7d84f6469df325e30bec735fc81cf131f6c4a3d4"}
Apr 17 11:30:50.797680 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:50.797373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:50.797680 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:50.797373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:50.797680 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:50.797502 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:50.797680 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:50.797373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:50.797680 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:50.797566 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:50.797680 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:50.797666 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:52.795264 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:52.795041 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:52.795658 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:52.795106 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:52.795658 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:52.795363 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240"
Apr 17 11:30:52.795658 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:52.795130 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:52.795658 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:52.795436 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401"
Apr 17 11:30:52.795871 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:52.795561 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:30:54.793557 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:54.793523 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:54.794214 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:54.793533 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:30:54.794214 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:54.793674 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49lr8" podUID="e74f83b5-a2a2-4262-89a6-a122df9a5401" Apr 17 11:30:54.794214 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:54.793539 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp" Apr 17 11:30:54.794214 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:54.793758 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218" Apr 17 11:30:54.794214 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:54.793795 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cgpzp" podUID="34d03c01-00bf-416b-8b46-2274587cc240" Apr 17 11:30:55.085313 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.085218 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-64.ec2.internal" event="NodeReady" Apr 17 11:30:55.085472 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.085399 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:30:55.128724 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.128693 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gsgsr"] Apr 17 11:30:55.155491 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.155459 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c5z7w"] Apr 17 11:30:55.155673 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.155649 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gsgsr" Apr 17 11:30:55.158588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.158545 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:30:55.158588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.158547 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ltml9\"" Apr 17 11:30:55.158779 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.158595 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:30:55.175285 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.175245 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gsgsr"] Apr 17 11:30:55.175285 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.175285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c5z7w"] 
Apr 17 11:30:55.175484 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.175388 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:55.179051 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.179030 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 11:30:55.179051 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.179048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 11:30:55.179218 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.179134 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hmj98\""
Apr 17 11:30:55.179422 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.179396 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 11:30:55.230261 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.230228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15ba3943-b4d6-43fe-88ff-590573a317b8-config-volume\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.230434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.230292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:55.230434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.230337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47vbn\" (UniqueName: \"kubernetes.io/projected/8a43f043-3738-4fc8-9a0f-9a3de52038b5-kube-api-access-47vbn\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:55.230434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.230363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26phc\" (UniqueName: \"kubernetes.io/projected/15ba3943-b4d6-43fe-88ff-590573a317b8-kube-api-access-26phc\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.230550 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.230446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15ba3943-b4d6-43fe-88ff-590573a317b8-tmp-dir\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.230550 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.230500 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.331533 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.331503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15ba3943-b4d6-43fe-88ff-590573a317b8-tmp-dir\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.331533 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.331544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.331772 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.331654 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:30:55.331772 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.331703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15ba3943-b4d6-43fe-88ff-590573a317b8-config-volume\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.331772 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.331716 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:55.831698404 +0000 UTC m=+33.629412999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found
Apr 17 11:30:55.331887 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.331771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:55.331887 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.331824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47vbn\" (UniqueName: \"kubernetes.io/projected/8a43f043-3738-4fc8-9a0f-9a3de52038b5-kube-api-access-47vbn\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:55.331887 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.331855 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:30:55.332004 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.331900 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:55.831886304 +0000 UTC m=+33.629600899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found
Apr 17 11:30:55.332004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.331896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15ba3943-b4d6-43fe-88ff-590573a317b8-tmp-dir\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.332004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.331855 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26phc\" (UniqueName: \"kubernetes.io/projected/15ba3943-b4d6-43fe-88ff-590573a317b8-kube-api-access-26phc\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.332321 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.332304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15ba3943-b4d6-43fe-88ff-590573a317b8-config-volume\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.345873 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.345807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26phc\" (UniqueName: \"kubernetes.io/projected/15ba3943-b4d6-43fe-88ff-590573a317b8-kube-api-access-26phc\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.346041 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.346019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47vbn\" (UniqueName: \"kubernetes.io/projected/8a43f043-3738-4fc8-9a0f-9a3de52038b5-kube-api-access-47vbn\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:55.433289 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.433241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:55.433463 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.433402 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:55.433527 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.433476 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:27.433457621 +0000 UTC m=+65.231172222 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:30:55.533547 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.533516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:55.533734 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.533689 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:30:55.533734 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.533711 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:30:55.533734 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.533722 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5m77s for pod openshift-network-diagnostics/network-check-target-cgpzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:55.533895 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.533786 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s podName:34d03c01-00bf-416b-8b46-2274587cc240 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:27.533766164 +0000 UTC m=+65.331480775 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5m77s" (UniqueName: "kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s") pod "network-check-target-cgpzp" (UID: "34d03c01-00bf-416b-8b46-2274587cc240") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:30:55.835653 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.835626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:55.836117 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:55.835688 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:55.836117 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.835775 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:30:55.836117 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.835821 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:56.835809117 +0000 UTC m=+34.633523712 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found
Apr 17 11:30:55.836117 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.835777 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:30:55.836117 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:55.835890 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:56.835878982 +0000 UTC m=+34.633593589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found
Apr 17 11:30:56.023039 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.022996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerStarted","Data":"2db7b98819378e1e8b0ca907b7299b82823e6894bed0342318c4e5b5164e9420"}
Apr 17 11:30:56.794208 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.794165 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:56.794522 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.794170 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:30:56.794522 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.794182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:30:56.797428 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.797404 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 11:30:56.798847 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.798828 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fqsgp\""
Apr 17 11:30:56.798847 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.798844 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 11:30:56.799005 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.798875 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 11:30:56.799005 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.798833 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lzg8w\""
Apr 17 11:30:56.799005 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.798891 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 11:30:56.842174 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.842151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:56.842500 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:56.842235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:56.842500 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:56.842312 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:30:56.842500 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:56.842365 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:58.842350324 +0000 UTC m=+36.640064919 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found
Apr 17 11:30:56.842500 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:56.842367 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:30:56.842500 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:56.842414 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:30:58.8424018 +0000 UTC m=+36.640116395 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found
Apr 17 11:30:57.027484 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:57.027454 2577 generic.go:358] "Generic (PLEG): container finished" podID="bb2737eb-5571-4fee-8d9a-10110cc1a205" containerID="2db7b98819378e1e8b0ca907b7299b82823e6894bed0342318c4e5b5164e9420" exitCode=0
Apr 17 11:30:57.027628 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:57.027502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerDied","Data":"2db7b98819378e1e8b0ca907b7299b82823e6894bed0342318c4e5b5164e9420"}
Apr 17 11:30:58.031912 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:58.031883 2577 generic.go:358] "Generic (PLEG): container finished" podID="bb2737eb-5571-4fee-8d9a-10110cc1a205" containerID="cece8e4c743e55521bb19135237b0c7a3d950bdecbd5f56b09ac05fe16325388" exitCode=0
Apr 17 11:30:58.032259 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:58.031937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerDied","Data":"cece8e4c743e55521bb19135237b0c7a3d950bdecbd5f56b09ac05fe16325388"}
Apr 17 11:30:58.857875 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:58.857643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:30:58.857875 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:58.857868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:30:58.858039 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:58.857810 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:30:58.858039 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:58.857951 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:02.857935224 +0000 UTC m=+40.655649818 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found
Apr 17 11:30:58.858039 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:58.857957 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:30:58.858039 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:30:58.857994 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:02.857983613 +0000 UTC m=+40.655698207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found
Apr 17 11:30:59.036620 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:59.036586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl46q" event={"ID":"bb2737eb-5571-4fee-8d9a-10110cc1a205","Type":"ContainerStarted","Data":"29a7d36a935d3bd05babaa4aa250d12a0080ad8103be2078d1c6604f4dc97589"}
Apr 17 11:30:59.062240 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:59.062193 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fl46q" podStartSLOduration=5.283862783 podStartE2EDuration="37.062180762s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:30:24.032809005 +0000 UTC m=+1.830523603" lastFinishedPulling="2026-04-17 11:30:55.811126984 +0000 UTC m=+33.608841582" observedRunningTime="2026-04-17 11:30:59.060455695 +0000 UTC m=+36.858170311" watchObservedRunningTime="2026-04-17 11:30:59.062180762 +0000 UTC m=+36.859895378"
Apr 17 11:30:59.160532 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:59.160447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:59.163414 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:59.163391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e74f83b5-a2a2-4262-89a6-a122df9a5401-original-pull-secret\") pod \"global-pull-secret-syncer-49lr8\" (UID: \"e74f83b5-a2a2-4262-89a6-a122df9a5401\") " pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:59.203431 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:59.203406 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49lr8"
Apr 17 11:30:59.407008 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:30:59.406967 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-49lr8"]
Apr 17 11:30:59.411877 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:30:59.411808 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode74f83b5_a2a2_4262_89a6_a122df9a5401.slice/crio-325536d5bd035fec9bc79d0cef35827138c60cb4919a2eec58b26270ccf62437 WatchSource:0}: Error finding container 325536d5bd035fec9bc79d0cef35827138c60cb4919a2eec58b26270ccf62437: Status 404 returned error can't find the container with id 325536d5bd035fec9bc79d0cef35827138c60cb4919a2eec58b26270ccf62437
Apr 17 11:31:00.040044 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:00.039994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-49lr8" event={"ID":"e74f83b5-a2a2-4262-89a6-a122df9a5401","Type":"ContainerStarted","Data":"325536d5bd035fec9bc79d0cef35827138c60cb4919a2eec58b26270ccf62437"}
Apr 17 11:31:02.887259 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:02.887030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:31:02.887689 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:02.887174 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:31:02.887689 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:02.887328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:31:02.887689 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:02.887385 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:10.887365701 +0000 UTC m=+48.685080316 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found
Apr 17 11:31:02.887689 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:02.887449 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:31:02.887689 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:02.887516 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:10.887499839 +0000 UTC m=+48.685214434 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found Apr 17 11:31:04.048777 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.048683 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-49lr8" event={"ID":"e74f83b5-a2a2-4262-89a6-a122df9a5401","Type":"ContainerStarted","Data":"14f6bd1e4f1d8b4d7f9b347837dc063aafe06ce6e71fc918340b6c384c551e10"} Apr 17 11:31:04.066803 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.066759 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-49lr8" podStartSLOduration=32.687531645 podStartE2EDuration="37.066743468s" podCreationTimestamp="2026-04-17 11:30:27 +0000 UTC" firstStartedPulling="2026-04-17 11:30:59.414034392 +0000 UTC m=+37.211748990" lastFinishedPulling="2026-04-17 11:31:03.793246218 +0000 UTC m=+41.590960813" observedRunningTime="2026-04-17 11:31:04.066223681 +0000 UTC m=+41.863938300" watchObservedRunningTime="2026-04-17 11:31:04.066743468 +0000 UTC m=+41.864458084" Apr 17 11:31:04.792140 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.792099 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f"] Apr 17 11:31:04.828060 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.828032 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:04.830546 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.830522 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 11:31:04.830663 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.830547 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 11:31:04.831218 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.831198 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl"] Apr 17 11:31:04.831593 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.831571 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 11:31:04.831593 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.831583 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 11:31:04.831771 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.831626 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pwhkj\"" Apr 17 11:31:04.849931 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.849910 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f"] Apr 17 11:31:04.850030 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.849948 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:04.850030 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.849952 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl"] Apr 17 11:31:04.852510 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.852491 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 11:31:04.852615 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.852521 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 11:31:04.852615 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.852581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 11:31:04.852723 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.852613 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 11:31:04.902695 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.902664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00891146-59a6-466a-a58d-7f743267b099-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77f54db55c-v5k4f\" (UID: \"00891146-59a6-466a-a58d-7f743267b099\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:04.902842 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:04.902700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j98r7\" (UniqueName: \"kubernetes.io/projected/00891146-59a6-466a-a58d-7f743267b099-kube-api-access-j98r7\") pod \"managed-serviceaccount-addon-agent-77f54db55c-v5k4f\" (UID: \"00891146-59a6-466a-a58d-7f743267b099\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:05.003315 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00891146-59a6-466a-a58d-7f743267b099-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77f54db55c-v5k4f\" (UID: \"00891146-59a6-466a-a58d-7f743267b099\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:05.003467 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j98r7\" (UniqueName: \"kubernetes.io/projected/00891146-59a6-466a-a58d-7f743267b099-kube-api-access-j98r7\") pod \"managed-serviceaccount-addon-agent-77f54db55c-v5k4f\" (UID: \"00891146-59a6-466a-a58d-7f743267b099\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:05.003467 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.003467 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.003467 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5ng\" (UniqueName: \"kubernetes.io/projected/a1c4f670-41a2-4be5-8152-e22905ef9201-kube-api-access-xx5ng\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.003636 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a1c4f670-41a2-4be5-8152-e22905ef9201-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.003636 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003549 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-ca\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.003636 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.003604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-hub\") pod 
\"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.006971 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.006944 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/00891146-59a6-466a-a58d-7f743267b099-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77f54db55c-v5k4f\" (UID: \"00891146-59a6-466a-a58d-7f743267b099\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:05.011799 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.011778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j98r7\" (UniqueName: \"kubernetes.io/projected/00891146-59a6-466a-a58d-7f743267b099-kube-api-access-j98r7\") pod \"managed-serviceaccount-addon-agent-77f54db55c-v5k4f\" (UID: \"00891146-59a6-466a-a58d-7f743267b099\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:05.104376 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.104297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.104376 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.104336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.104376 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.104368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5ng\" (UniqueName: \"kubernetes.io/projected/a1c4f670-41a2-4be5-8152-e22905ef9201-kube-api-access-xx5ng\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.104918 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.104397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a1c4f670-41a2-4be5-8152-e22905ef9201-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.104918 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.104428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-ca\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.104918 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.104446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-hub\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.105361 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.105334 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a1c4f670-41a2-4be5-8152-e22905ef9201-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.106967 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.106933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-ca\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.107073 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.106944 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.107073 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.107034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.107073 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.107039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a1c4f670-41a2-4be5-8152-e22905ef9201-hub\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: 
\"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.112414 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.112391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5ng\" (UniqueName: \"kubernetes.io/projected/a1c4f670-41a2-4be5-8152-e22905ef9201-kube-api-access-xx5ng\") pod \"cluster-proxy-proxy-agent-6bf584f5-d22zl\" (UID: \"a1c4f670-41a2-4be5-8152-e22905ef9201\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.150375 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.150351 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" Apr 17 11:31:05.167032 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.167006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:31:05.290152 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.290124 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f"] Apr 17 11:31:05.294163 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:31:05.294137 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00891146_59a6_466a_a58d_7f743267b099.slice/crio-9067e4e51884b04dc99a336956c6137f5ffde439d793a5688e4d1c282862bb9c WatchSource:0}: Error finding container 9067e4e51884b04dc99a336956c6137f5ffde439d793a5688e4d1c282862bb9c: Status 404 returned error can't find the container with id 9067e4e51884b04dc99a336956c6137f5ffde439d793a5688e4d1c282862bb9c Apr 17 11:31:05.322696 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:05.322673 2577 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl"] Apr 17 11:31:05.325208 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:31:05.325176 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c4f670_41a2_4be5_8152_e22905ef9201.slice/crio-0742454e017e517f5ecab1ee33c2bac18997212edc02219940e240a63c2304a4 WatchSource:0}: Error finding container 0742454e017e517f5ecab1ee33c2bac18997212edc02219940e240a63c2304a4: Status 404 returned error can't find the container with id 0742454e017e517f5ecab1ee33c2bac18997212edc02219940e240a63c2304a4 Apr 17 11:31:06.055155 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:06.055111 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" event={"ID":"a1c4f670-41a2-4be5-8152-e22905ef9201","Type":"ContainerStarted","Data":"0742454e017e517f5ecab1ee33c2bac18997212edc02219940e240a63c2304a4"} Apr 17 11:31:06.056545 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:06.056514 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" event={"ID":"00891146-59a6-466a-a58d-7f743267b099","Type":"ContainerStarted","Data":"9067e4e51884b04dc99a336956c6137f5ffde439d793a5688e4d1c282862bb9c"} Apr 17 11:31:10.065904 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:10.065866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" event={"ID":"a1c4f670-41a2-4be5-8152-e22905ef9201","Type":"ContainerStarted","Data":"bf1d44119a60b96d1f6345a4014d9b8c480fa38f21dad03e0c40f0b2592ff49e"} Apr 17 11:31:10.067369 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:10.067316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" event={"ID":"00891146-59a6-466a-a58d-7f743267b099","Type":"ContainerStarted","Data":"d1f916c0cf3609c3d64d225082d7288357fa6f58dde4396d22fb171ccaa40f8d"} Apr 17 11:31:10.083589 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:10.083543 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" podStartSLOduration=2.042947112 podStartE2EDuration="6.083527556s" podCreationTimestamp="2026-04-17 11:31:04 +0000 UTC" firstStartedPulling="2026-04-17 11:31:05.295914308 +0000 UTC m=+43.093628906" lastFinishedPulling="2026-04-17 11:31:09.336494751 +0000 UTC m=+47.134209350" observedRunningTime="2026-04-17 11:31:10.082742822 +0000 UTC m=+47.880457440" watchObservedRunningTime="2026-04-17 11:31:10.083527556 +0000 UTC m=+47.881242172" Apr 17 11:31:10.950886 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:10.950848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w" Apr 17 11:31:10.951093 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:10.950941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr" Apr 17 11:31:10.951093 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:10.951017 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:31:10.951093 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:10.951067 2577 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:31:10.951257 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:10.951099 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:26.951076497 +0000 UTC m=+64.748791104 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found Apr 17 11:31:10.951257 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:10.951121 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:26.95111089 +0000 UTC m=+64.748825501 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found Apr 17 11:31:12.073437 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:12.073379 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" event={"ID":"a1c4f670-41a2-4be5-8152-e22905ef9201","Type":"ContainerStarted","Data":"015faa4307d68d5a7866c99080d38ae87713baa4b9dc8b1c9b05b3c37e0f7944"} Apr 17 11:31:12.073437 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:12.073415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" event={"ID":"a1c4f670-41a2-4be5-8152-e22905ef9201","Type":"ContainerStarted","Data":"e2b0f4ecc1419e7d2c8e698ae78d56e36aeff66b101f068c111e4c3976ef6eee"} Apr 17 11:31:13.095540 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:13.095494 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" podStartSLOduration=2.598069944 podStartE2EDuration="9.095478749s" podCreationTimestamp="2026-04-17 11:31:04 +0000 UTC" firstStartedPulling="2026-04-17 11:31:05.32689038 +0000 UTC m=+43.124604979" lastFinishedPulling="2026-04-17 11:31:11.824299184 +0000 UTC m=+49.622013784" observedRunningTime="2026-04-17 11:31:13.095152276 +0000 UTC m=+50.892866910" watchObservedRunningTime="2026-04-17 11:31:13.095478749 +0000 UTC m=+50.893193366" Apr 17 11:31:21.020659 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:21.020630 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-znj2s" Apr 17 11:31:26.965245 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:26.965208 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr" Apr 17 11:31:26.965656 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:26.965265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w" Apr 17 11:31:26.965656 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:26.965370 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:31:26.965656 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:26.965374 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:31:26.965656 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:26.965422 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:31:58.96540826 +0000 UTC m=+96.763122855 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found Apr 17 11:31:26.965656 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:26.965435 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:31:58.965428999 +0000 UTC m=+96.763143593 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found
Apr 17 11:31:27.468423 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.468386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:31:27.471340 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.471323 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 11:31:27.479353 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:27.479331 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:31:27.479423 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:27.479392 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:31.479376023 +0000 UTC m=+129.277090627 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : secret "metrics-daemon-secret" not found
Apr 17 11:31:27.569712 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.569678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:31:27.572686 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.572669 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 11:31:27.584190 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.584174 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 11:31:27.593876 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.593852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m77s\" (UniqueName: \"kubernetes.io/projected/34d03c01-00bf-416b-8b46-2274587cc240-kube-api-access-5m77s\") pod \"network-check-target-cgpzp\" (UID: \"34d03c01-00bf-416b-8b46-2274587cc240\") " pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:31:27.716093 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.716063 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fqsgp\""
Apr 17 11:31:27.724082 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.724036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:31:27.837538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:27.837507 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cgpzp"]
Apr 17 11:31:27.840538 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:31:27.840511 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d03c01_00bf_416b_8b46_2274587cc240.slice/crio-cd0dee109545888710b57daa40e780caa7c800f36b04ef7af3f8f1ba7b0ee043 WatchSource:0}: Error finding container cd0dee109545888710b57daa40e780caa7c800f36b04ef7af3f8f1ba7b0ee043: Status 404 returned error can't find the container with id cd0dee109545888710b57daa40e780caa7c800f36b04ef7af3f8f1ba7b0ee043
Apr 17 11:31:28.107747 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:28.107667 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cgpzp" event={"ID":"34d03c01-00bf-416b-8b46-2274587cc240","Type":"ContainerStarted","Data":"cd0dee109545888710b57daa40e780caa7c800f36b04ef7af3f8f1ba7b0ee043"}
Apr 17 11:31:31.115286 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:31.115235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cgpzp" event={"ID":"34d03c01-00bf-416b-8b46-2274587cc240","Type":"ContainerStarted","Data":"135d0c4a6aa6e0db130328582a48be157a03f95c7a28d799b6cd14914b896f56"}
Apr 17 11:31:31.115821 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:31.115373 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:31:31.133688 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:31.133644 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cgpzp" podStartSLOduration=66.536744592 podStartE2EDuration="1m9.133632274s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:31:27.842393749 +0000 UTC m=+65.640108343" lastFinishedPulling="2026-04-17 11:31:30.439281415 +0000 UTC m=+68.236996025" observedRunningTime="2026-04-17 11:31:31.132878913 +0000 UTC m=+68.930593529" watchObservedRunningTime="2026-04-17 11:31:31.133632274 +0000 UTC m=+68.931346885"
Apr 17 11:31:58.989696 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:58.989667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:31:58.990050 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:31:58.989722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:31:58.990050 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:58.989814 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:31:58.990050 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:58.989816 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:31:58.990050 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:58.989874 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls podName:15ba3943-b4d6-43fe-88ff-590573a317b8 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:02.98986161 +0000 UTC m=+160.787576206 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls") pod "dns-default-gsgsr" (UID: "15ba3943-b4d6-43fe-88ff-590573a317b8") : secret "dns-default-metrics-tls" not found
Apr 17 11:31:58.990050 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:31:58.989889 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert podName:8a43f043-3738-4fc8-9a0f-9a3de52038b5 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:02.989882145 +0000 UTC m=+160.787596740 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert") pod "ingress-canary-c5z7w" (UID: "8a43f043-3738-4fc8-9a0f-9a3de52038b5") : secret "canary-serving-cert" not found
Apr 17 11:32:02.120232 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:02.120207 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cgpzp"
Apr 17 11:32:30.623142 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:30.623115 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-stmhs_3f06ebee-cbe3-4266-bf01-0bb889437be7/dns-node-resolver/0.log"
Apr 17 11:32:31.221216 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:31.221188 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-frb9s_a1b0eda5-8b26-4ce3-af63-74364b0ea28f/node-ca/0.log"
Apr 17 11:32:31.530067 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:31.529988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:32:31.530210 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:32:31.530092 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:32:31.530210 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:32:31.530147 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs podName:4155f35e-1865-499f-88fb-fdde1e2c1218 nodeName:}" failed. No retries permitted until 2026-04-17 11:34:33.530131197 +0000 UTC m=+251.327845791 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs") pod "network-metrics-daemon-xw9bz" (UID: "4155f35e-1865-499f-88fb-fdde1e2c1218") : secret "metrics-daemon-secret" not found
Apr 17 11:32:53.082360 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.082332 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dn4vn"]
Apr 17 11:32:53.085301 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.085261 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.085594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.085505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d12374a5-1ff4-4f08-9980-138794998ec3-crio-socket\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.085594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.085563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d12374a5-1ff4-4f08-9980-138794998ec3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.085594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.085589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t574z\" (UniqueName: \"kubernetes.io/projected/d12374a5-1ff4-4f08-9980-138794998ec3-kube-api-access-t574z\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.085753 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.085654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d12374a5-1ff4-4f08-9980-138794998ec3-data-volume\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.085753 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.085702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d12374a5-1ff4-4f08-9980-138794998ec3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.092806 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.092789 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 11:32:53.092806 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.092802 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4b7m5\""
Apr 17 11:32:53.092980 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.092905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 11:32:53.092980 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.092960 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 11:32:53.093062 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.093007 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 11:32:53.103065 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.103021 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dn4vn"]
Apr 17 11:32:53.186096 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d12374a5-1ff4-4f08-9980-138794998ec3-crio-socket\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.186254 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d12374a5-1ff4-4f08-9980-138794998ec3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.186254 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t574z\" (UniqueName: \"kubernetes.io/projected/d12374a5-1ff4-4f08-9980-138794998ec3-kube-api-access-t574z\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.186254 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d12374a5-1ff4-4f08-9980-138794998ec3-data-volume\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.186254 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d12374a5-1ff4-4f08-9980-138794998ec3-crio-socket\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.186496 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d12374a5-1ff4-4f08-9980-138794998ec3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.186496 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d12374a5-1ff4-4f08-9980-138794998ec3-data-volume\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.186790 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.186772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d12374a5-1ff4-4f08-9980-138794998ec3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.188526 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.188508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d12374a5-1ff4-4f08-9980-138794998ec3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.198862 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.198837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t574z\" (UniqueName: \"kubernetes.io/projected/d12374a5-1ff4-4f08-9980-138794998ec3-kube-api-access-t574z\") pod \"insights-runtime-extractor-dn4vn\" (UID: \"d12374a5-1ff4-4f08-9980-138794998ec3\") " pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.394354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.394261 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dn4vn"
Apr 17 11:32:53.511248 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:53.511216 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dn4vn"]
Apr 17 11:32:53.515889 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:32:53.515866 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd12374a5_1ff4_4f08_9980_138794998ec3.slice/crio-6e20fd32d0ce3af784c15b21e893b365e6b7e1323671ee0d384df4a808f46cbc WatchSource:0}: Error finding container 6e20fd32d0ce3af784c15b21e893b365e6b7e1323671ee0d384df4a808f46cbc: Status 404 returned error can't find the container with id 6e20fd32d0ce3af784c15b21e893b365e6b7e1323671ee0d384df4a808f46cbc
Apr 17 11:32:54.297421 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:54.297382 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn4vn" event={"ID":"d12374a5-1ff4-4f08-9980-138794998ec3","Type":"ContainerStarted","Data":"33e4594107e5af993defefd6883416d08794a1b6199e2182f5b2bbc0b3e4c9ec"}
Apr 17 11:32:54.297421 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:54.297424 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn4vn" event={"ID":"d12374a5-1ff4-4f08-9980-138794998ec3","Type":"ContainerStarted","Data":"3971b6af820431359ec13406f35467c7638456a0134e9d525589c7e66f96ec27"}
Apr 17 11:32:54.297812 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:54.297434 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn4vn" event={"ID":"d12374a5-1ff4-4f08-9980-138794998ec3","Type":"ContainerStarted","Data":"6e20fd32d0ce3af784c15b21e893b365e6b7e1323671ee0d384df4a808f46cbc"}
Apr 17 11:32:56.304112 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:56.304071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dn4vn" event={"ID":"d12374a5-1ff4-4f08-9980-138794998ec3","Type":"ContainerStarted","Data":"c3b4a45b5680720646e55c665cd591bffaae8a856de7e0335bc928c04bad7ebc"}
Apr 17 11:32:56.323790 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:56.323745 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dn4vn" podStartSLOduration=1.219668986 podStartE2EDuration="3.323732299s" podCreationTimestamp="2026-04-17 11:32:53 +0000 UTC" firstStartedPulling="2026-04-17 11:32:53.566406932 +0000 UTC m=+151.364121527" lastFinishedPulling="2026-04-17 11:32:55.670470245 +0000 UTC m=+153.468184840" observedRunningTime="2026-04-17 11:32:56.321909314 +0000 UTC m=+154.119623932" watchObservedRunningTime="2026-04-17 11:32:56.323732299 +0000 UTC m=+154.121446974"
Apr 17 11:32:58.166737 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:32:58.166692 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gsgsr" podUID="15ba3943-b4d6-43fe-88ff-590573a317b8"
Apr 17 11:32:58.184888 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:32:58.184857 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-c5z7w" podUID="8a43f043-3738-4fc8-9a0f-9a3de52038b5"
Apr 17 11:32:58.309461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:58.309435 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:32:58.892160 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:58.892088 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"]
Apr 17 11:32:58.894884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:58.894870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:32:58.897426 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:58.897406 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 11:32:58.897544 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:58.897427 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-wmjcf\""
Apr 17 11:32:58.903027 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:58.903007 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"]
Apr 17 11:32:58.930299 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:58.930261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f58c2499-4fc3-4b9f-88eb-b576bf8234f4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5m4sp\" (UID: \"f58c2499-4fc3-4b9f-88eb-b576bf8234f4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:32:59.030640 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:59.030610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f58c2499-4fc3-4b9f-88eb-b576bf8234f4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5m4sp\" (UID: \"f58c2499-4fc3-4b9f-88eb-b576bf8234f4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:32:59.030781 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:32:59.030748 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 11:32:59.030829 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:32:59.030804 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f58c2499-4fc3-4b9f-88eb-b576bf8234f4-tls-certificates podName:f58c2499-4fc3-4b9f-88eb-b576bf8234f4 nodeName:}" failed. No retries permitted until 2026-04-17 11:32:59.530789507 +0000 UTC m=+157.328504102 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f58c2499-4fc3-4b9f-88eb-b576bf8234f4-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-5m4sp" (UID: "f58c2499-4fc3-4b9f-88eb-b576bf8234f4") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 11:32:59.534479 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:59.534444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f58c2499-4fc3-4b9f-88eb-b576bf8234f4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5m4sp\" (UID: \"f58c2499-4fc3-4b9f-88eb-b576bf8234f4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:32:59.536898 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:59.536874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f58c2499-4fc3-4b9f-88eb-b576bf8234f4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5m4sp\" (UID: \"f58c2499-4fc3-4b9f-88eb-b576bf8234f4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:32:59.803871 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:59.803766 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:32:59.809608 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:32:59.809576 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-xw9bz" podUID="4155f35e-1865-499f-88fb-fdde1e2c1218"
Apr 17 11:32:59.913711 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:32:59.913679 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"]
Apr 17 11:32:59.917559 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:32:59.917535 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf58c2499_4fc3_4b9f_88eb_b576bf8234f4.slice/crio-46d2686445b348a3b319366b8a1ec22d1650544b448956c076f90446ee1fb8d6 WatchSource:0}: Error finding container 46d2686445b348a3b319366b8a1ec22d1650544b448956c076f90446ee1fb8d6: Status 404 returned error can't find the container with id 46d2686445b348a3b319366b8a1ec22d1650544b448956c076f90446ee1fb8d6
Apr 17 11:33:00.317776 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:00.317740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp" event={"ID":"f58c2499-4fc3-4b9f-88eb-b576bf8234f4","Type":"ContainerStarted","Data":"46d2686445b348a3b319366b8a1ec22d1650544b448956c076f90446ee1fb8d6"}
Apr 17 11:33:01.321279 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:01.321233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp" event={"ID":"f58c2499-4fc3-4b9f-88eb-b576bf8234f4","Type":"ContainerStarted","Data":"721811c06c2143b36533bb6a3b57377894778fede7c1cc0ccd8ae6e3e084b10b"}
Apr 17 11:33:01.337659 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:01.337611 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp" podStartSLOduration=2.128778733 podStartE2EDuration="3.3375976s" podCreationTimestamp="2026-04-17 11:32:58 +0000 UTC" firstStartedPulling="2026-04-17 11:32:59.919396591 +0000 UTC m=+157.717111185" lastFinishedPulling="2026-04-17 11:33:01.128215449 +0000 UTC m=+158.925930052" observedRunningTime="2026-04-17 11:33:01.337465527 +0000 UTC m=+159.135180143" watchObservedRunningTime="2026-04-17 11:33:01.3375976 +0000 UTC m=+159.135312217"
Apr 17 11:33:02.324138 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:02.324102 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:33:02.328771 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:02.328748 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5m4sp"
Apr 17 11:33:03.059265 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.059234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:33:03.059461 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.059377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:33:03.061680 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.061657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a43f043-3738-4fc8-9a0f-9a3de52038b5-cert\") pod \"ingress-canary-c5z7w\" (UID: \"8a43f043-3738-4fc8-9a0f-9a3de52038b5\") " pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:33:03.061784 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.061694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15ba3943-b4d6-43fe-88ff-590573a317b8-metrics-tls\") pod \"dns-default-gsgsr\" (UID: \"15ba3943-b4d6-43fe-88ff-590573a317b8\") " pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:33:03.113470 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.113448 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ltml9\""
Apr 17 11:33:03.120719 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.120705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:33:03.234761 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.234732 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gsgsr"]
Apr 17 11:33:03.237512 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:33:03.237482 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ba3943_b4d6_43fe_88ff_590573a317b8.slice/crio-b131e34c296497434042b0d8bba9a71b0c23c859fc8ddae6a442568128f287bb WatchSource:0}: Error finding container b131e34c296497434042b0d8bba9a71b0c23c859fc8ddae6a442568128f287bb: Status 404 returned error can't find the container with id b131e34c296497434042b0d8bba9a71b0c23c859fc8ddae6a442568128f287bb
Apr 17 11:33:03.327586 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:03.327510 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gsgsr" event={"ID":"15ba3943-b4d6-43fe-88ff-590573a317b8","Type":"ContainerStarted","Data":"b131e34c296497434042b0d8bba9a71b0c23c859fc8ddae6a442568128f287bb"}
Apr 17 11:33:05.334973 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:05.334940 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gsgsr" event={"ID":"15ba3943-b4d6-43fe-88ff-590573a317b8","Type":"ContainerStarted","Data":"b67ae5eb55074a735b6c70153d6f20912a9641211ddc9339af2bce4abc0627d0"}
Apr 17 11:33:05.334973 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:05.334973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gsgsr" event={"ID":"15ba3943-b4d6-43fe-88ff-590573a317b8","Type":"ContainerStarted","Data":"779acaa5f9dc23594057a5212cf96ef1b405189eb943f90ce8d19559c90ee13b"}
Apr 17 11:33:05.335394 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:05.335067 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gsgsr"
Apr 17 11:33:05.358125 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:05.358077 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gsgsr" podStartSLOduration=129.081600049 podStartE2EDuration="2m10.358058468s" podCreationTimestamp="2026-04-17 11:30:55 +0000 UTC" firstStartedPulling="2026-04-17 11:33:03.239242783 +0000 UTC m=+161.036957379" lastFinishedPulling="2026-04-17 11:33:04.515701203 +0000 UTC m=+162.313415798" observedRunningTime="2026-04-17 11:33:05.357574023 +0000 UTC m=+163.155288632" watchObservedRunningTime="2026-04-17 11:33:05.358058468 +0000 UTC m=+163.155773078"
Apr 17 11:33:07.294505 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.294448 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx"]
Apr 17 11:33:07.297589 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.297572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx"
Apr 17 11:33:07.299960 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.299921 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 11:33:07.299960 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.299934 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 11:33:07.300154 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.299979 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 11:33:07.300154 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.299937 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-d94tz\""
Apr 17 11:33:07.301095 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.301078 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 11:33:07.301198 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.301145 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 11:33:07.306509 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.306490 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx"]
Apr 17 11:33:07.319697 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.319675 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9wzj8"]
Apr 17 11:33:07.322653 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.322635 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9wzj8"
Apr 17 11:33:07.325121 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.325103 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 11:33:07.325121 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.325135 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wj9ml\""
Apr 17 11:33:07.325329 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.325139 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 11:33:07.325329 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.325250 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 11:33:07.392985 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.392958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-accelerators-collector-config\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393143 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.392998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393143 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/329c88e2-dddc-446c-b2b2-4dff96a8eb08-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.393143 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-textfile\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393143 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/329c88e2-dddc-446c-b2b2-4dff96a8eb08-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.393143 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5f0a58b-6092-4b24-b961-0e7f736de486-metrics-client-ca\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-wtmp\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szqs\" (UniqueName: \"kubernetes.io/projected/329c88e2-dddc-446c-b2b2-4dff96a8eb08-kube-api-access-4szqs\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.393354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393236 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6spdt\" (UniqueName: \"kubernetes.io/projected/b5f0a58b-6092-4b24-b961-0e7f736de486-kube-api-access-6spdt\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-root\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.393354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393315 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/329c88e2-dddc-446c-b2b2-4dff96a8eb08-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.393354 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.393332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-sys\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494167 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-accelerators-collector-config\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494167 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494439 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/329c88e2-dddc-446c-b2b2-4dff96a8eb08-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.494439 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-textfile\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494439 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/329c88e2-dddc-446c-b2b2-4dff96a8eb08-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.494588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494520 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5f0a58b-6092-4b24-b961-0e7f736de486-metrics-client-ca\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-wtmp\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494588 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4szqs\" (UniqueName: \"kubernetes.io/projected/329c88e2-dddc-446c-b2b2-4dff96a8eb08-kube-api-access-4szqs\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.494771 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6spdt\" (UniqueName: 
\"kubernetes.io/projected/b5f0a58b-6092-4b24-b961-0e7f736de486-kube-api-access-6spdt\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494771 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:33:07.494681 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:33:07.494771 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-textfile\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494771 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:33:07.494753 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls podName:b5f0a58b-6092-4b24-b961-0e7f736de486 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:07.994733966 +0000 UTC m=+165.792448578 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls") pod "node-exporter-9wzj8" (UID: "b5f0a58b-6092-4b24-b961-0e7f736de486") : secret "node-exporter-tls" not found Apr 17 11:33:07.494972 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-wtmp\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.494972 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494907 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-accelerators-collector-config\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.495056 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.494865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-root\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.495056 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.495034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-root\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.495128 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.495049 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/329c88e2-dddc-446c-b2b2-4dff96a8eb08-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.495163 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.495129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-sys\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.495215 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.495076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5f0a58b-6092-4b24-b961-0e7f736de486-metrics-client-ca\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.495252 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.495224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5f0a58b-6092-4b24-b961-0e7f736de486-sys\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.496320 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.496297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/329c88e2-dddc-446c-b2b2-4dff96a8eb08-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.496955 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.496934 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/329c88e2-dddc-446c-b2b2-4dff96a8eb08-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.496997 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.496955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.497161 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.497144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/329c88e2-dddc-446c-b2b2-4dff96a8eb08-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.504515 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.504493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szqs\" (UniqueName: \"kubernetes.io/projected/329c88e2-dddc-446c-b2b2-4dff96a8eb08-kube-api-access-4szqs\") pod \"openshift-state-metrics-9d44df66c-2xksx\" (UID: \"329c88e2-dddc-446c-b2b2-4dff96a8eb08\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.504681 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.504656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6spdt\" (UniqueName: 
\"kubernetes.io/projected/b5f0a58b-6092-4b24-b961-0e7f736de486-kube-api-access-6spdt\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.607093 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.607006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" Apr 17 11:33:07.722222 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.722192 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx"] Apr 17 11:33:07.725127 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:33:07.725097 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod329c88e2_dddc_446c_b2b2_4dff96a8eb08.slice/crio-6baa6071f47dd0f83bd7d6a8d648cac974f356a1b33f345eee53bc374155ce37 WatchSource:0}: Error finding container 6baa6071f47dd0f83bd7d6a8d648cac974f356a1b33f345eee53bc374155ce37: Status 404 returned error can't find the container with id 6baa6071f47dd0f83bd7d6a8d648cac974f356a1b33f345eee53bc374155ce37 Apr 17 11:33:07.998914 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:07.998883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:07.999083 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:33:07.999025 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:33:07.999138 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:33:07.999086 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls podName:b5f0a58b-6092-4b24-b961-0e7f736de486 nodeName:}" failed. No retries permitted until 2026-04-17 11:33:08.999070441 +0000 UTC m=+166.796785036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls") pod "node-exporter-9wzj8" (UID: "b5f0a58b-6092-4b24-b961-0e7f736de486") : secret "node-exporter-tls" not found Apr 17 11:33:08.342691 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:08.342657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" event={"ID":"329c88e2-dddc-446c-b2b2-4dff96a8eb08","Type":"ContainerStarted","Data":"f57e9287b643d638dc29c7260f9cd68fe51d05c5ab0342b952a20f1e4b4b62aa"} Apr 17 11:33:08.342691 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:08.342692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" event={"ID":"329c88e2-dddc-446c-b2b2-4dff96a8eb08","Type":"ContainerStarted","Data":"000387e035d9226edab96c2503d45a85e0897a20383424d53a6f4b4c59de239b"} Apr 17 11:33:08.343081 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:08.342701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" event={"ID":"329c88e2-dddc-446c-b2b2-4dff96a8eb08","Type":"ContainerStarted","Data":"6baa6071f47dd0f83bd7d6a8d648cac974f356a1b33f345eee53bc374155ce37"} Apr 17 11:33:09.005405 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:09.005374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" 
Apr 17 11:33:09.007536 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:09.007515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5f0a58b-6092-4b24-b961-0e7f736de486-node-exporter-tls\") pod \"node-exporter-9wzj8\" (UID: \"b5f0a58b-6092-4b24-b961-0e7f736de486\") " pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:09.131937 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:09.131859 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9wzj8" Apr 17 11:33:09.139544 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:33:09.139516 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5f0a58b_6092_4b24_b961_0e7f736de486.slice/crio-f1d83cab091dcdd1d80ffc00b39419d0c96bff47da63f5f4d658ee26aa9384be WatchSource:0}: Error finding container f1d83cab091dcdd1d80ffc00b39419d0c96bff47da63f5f4d658ee26aa9384be: Status 404 returned error can't find the container with id f1d83cab091dcdd1d80ffc00b39419d0c96bff47da63f5f4d658ee26aa9384be Apr 17 11:33:09.347488 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:09.347446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" event={"ID":"329c88e2-dddc-446c-b2b2-4dff96a8eb08","Type":"ContainerStarted","Data":"98b81a5445522b7007aae88ab4a8f6b5e1ed17d6556e97ff68ecec5b1905a558"} Apr 17 11:33:09.348376 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:09.348348 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wzj8" event={"ID":"b5f0a58b-6092-4b24-b961-0e7f736de486","Type":"ContainerStarted","Data":"f1d83cab091dcdd1d80ffc00b39419d0c96bff47da63f5f4d658ee26aa9384be"} Apr 17 11:33:09.366768 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:09.366642 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2xksx" podStartSLOduration=1.382379888 podStartE2EDuration="2.366626783s" podCreationTimestamp="2026-04-17 11:33:07 +0000 UTC" firstStartedPulling="2026-04-17 11:33:07.858539208 +0000 UTC m=+165.656253812" lastFinishedPulling="2026-04-17 11:33:08.842786112 +0000 UTC m=+166.640500707" observedRunningTime="2026-04-17 11:33:09.366384467 +0000 UTC m=+167.164099085" watchObservedRunningTime="2026-04-17 11:33:09.366626783 +0000 UTC m=+167.164341382" Apr 17 11:33:10.352470 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:10.352437 2577 generic.go:358] "Generic (PLEG): container finished" podID="b5f0a58b-6092-4b24-b961-0e7f736de486" containerID="31365fb4efe6145ace867d23db1db3885624776406abe80b8dd0a04011fa8f35" exitCode=0 Apr 17 11:33:10.352932 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:10.352534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wzj8" event={"ID":"b5f0a58b-6092-4b24-b961-0e7f736de486","Type":"ContainerDied","Data":"31365fb4efe6145ace867d23db1db3885624776406abe80b8dd0a04011fa8f35"} Apr 17 11:33:10.353791 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:10.353770 2577 generic.go:358] "Generic (PLEG): container finished" podID="00891146-59a6-466a-a58d-7f743267b099" containerID="d1f916c0cf3609c3d64d225082d7288357fa6f58dde4396d22fb171ccaa40f8d" exitCode=255 Apr 17 11:33:10.353911 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:10.353852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" event={"ID":"00891146-59a6-466a-a58d-7f743267b099","Type":"ContainerDied","Data":"d1f916c0cf3609c3d64d225082d7288357fa6f58dde4396d22fb171ccaa40f8d"} Apr 17 11:33:10.354319 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:10.354302 2577 scope.go:117] "RemoveContainer" containerID="d1f916c0cf3609c3d64d225082d7288357fa6f58dde4396d22fb171ccaa40f8d" Apr 17 11:33:11.357970 
ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.357892 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f54db55c-v5k4f" event={"ID":"00891146-59a6-466a-a58d-7f743267b099","Type":"ContainerStarted","Data":"3066d6a401b27e4ed9c6a73eaf255d3f8cb3ec1cbe2941332f62cc6379f0e04b"} Apr 17 11:33:11.359637 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.359616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wzj8" event={"ID":"b5f0a58b-6092-4b24-b961-0e7f736de486","Type":"ContainerStarted","Data":"88b9b0a8f77fe5a34e948fc64a31b817a07c556734c1709165116e480ad6bf50"} Apr 17 11:33:11.359741 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.359642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9wzj8" event={"ID":"b5f0a58b-6092-4b24-b961-0e7f736de486","Type":"ContainerStarted","Data":"517d66ef5bb48f932f872c0b938202cdb72f6bc42d47a0c4433145f0e3f8ec93"} Apr 17 11:33:11.398574 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.398531 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9wzj8" podStartSLOduration=3.603859568 podStartE2EDuration="4.398518406s" podCreationTimestamp="2026-04-17 11:33:07 +0000 UTC" firstStartedPulling="2026-04-17 11:33:09.14131578 +0000 UTC m=+166.939030378" lastFinishedPulling="2026-04-17 11:33:09.935974616 +0000 UTC m=+167.733689216" observedRunningTime="2026-04-17 11:33:11.397607525 +0000 UTC m=+169.195322141" watchObservedRunningTime="2026-04-17 11:33:11.398518406 +0000 UTC m=+169.196233022" Apr 17 11:33:11.717975 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.717938 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-775d77b857-hppnr"] Apr 17 11:33:11.720938 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.720916 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.724546 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.724525 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 11:33:11.724546 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.724539 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-9bpgq\""
Apr 17 11:33:11.724740 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.724530 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 11:33:11.724740 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.724631 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-f7qubs2ufo9fe\""
Apr 17 11:33:11.724884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.724869 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 11:33:11.724953 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.724920 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 11:33:11.728742 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.728724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-775d77b857-hppnr"]
Apr 17 11:33:11.827398 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.827371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-secret-metrics-server-client-certs\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.827563 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.827411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2f2de312-3ca2-4d8f-ac12-6ed1decec071-audit-log\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.827563 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.827430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-client-ca-bundle\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.827563 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.827449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2de312-3ca2-4d8f-ac12-6ed1decec071-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.827563 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.827498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2f2de312-3ca2-4d8f-ac12-6ed1decec071-metrics-server-audit-profiles\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.827563 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.827559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-secret-metrics-server-tls\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.827802 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.827575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcczk\" (UniqueName: \"kubernetes.io/projected/2f2de312-3ca2-4d8f-ac12-6ed1decec071-kube-api-access-gcczk\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928519 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-secret-metrics-server-client-certs\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928672 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2f2de312-3ca2-4d8f-ac12-6ed1decec071-audit-log\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928672 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-client-ca-bundle\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928672 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2de312-3ca2-4d8f-ac12-6ed1decec071-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928672 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2f2de312-3ca2-4d8f-ac12-6ed1decec071-metrics-server-audit-profiles\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928672 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928651 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-secret-metrics-server-tls\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928672 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcczk\" (UniqueName: \"kubernetes.io/projected/2f2de312-3ca2-4d8f-ac12-6ed1decec071-kube-api-access-gcczk\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.928965 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.928942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/2f2de312-3ca2-4d8f-ac12-6ed1decec071-audit-log\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.929402 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.929383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f2de312-3ca2-4d8f-ac12-6ed1decec071-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.930520 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.930494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/2f2de312-3ca2-4d8f-ac12-6ed1decec071-metrics-server-audit-profiles\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.931164 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.931139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-client-ca-bundle\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.931239 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.931206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-secret-metrics-server-tls\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.931306 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.931262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/2f2de312-3ca2-4d8f-ac12-6ed1decec071-secret-metrics-server-client-certs\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:11.936349 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:11.936329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcczk\" (UniqueName: \"kubernetes.io/projected/2f2de312-3ca2-4d8f-ac12-6ed1decec071-kube-api-access-gcczk\") pod \"metrics-server-775d77b857-hppnr\" (UID: \"2f2de312-3ca2-4d8f-ac12-6ed1decec071\") " pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:12.030464 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.030380 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-775d77b857-hppnr"
Apr 17 11:33:12.091614 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.091582 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"]
Apr 17 11:33:12.096321 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.096301 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"
Apr 17 11:33:12.098675 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.098618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-9hrnr\""
Apr 17 11:33:12.098675 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.098623 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 11:33:12.101603 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.101578 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"]
Apr 17 11:33:12.130143 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.130115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bedd9666-11f0-4ba3-999c-964028f81db4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gdfmh\" (UID: \"bedd9666-11f0-4ba3-999c-964028f81db4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"
Apr 17 11:33:12.152715 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.152687 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-775d77b857-hppnr"]
Apr 17 11:33:12.156505 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:33:12.156481 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2de312_3ca2_4d8f_ac12_6ed1decec071.slice/crio-a3138008c6085ef8cec97633548b4cbf28b1650f256f77a535261be4e0190910 WatchSource:0}: Error finding container a3138008c6085ef8cec97633548b4cbf28b1650f256f77a535261be4e0190910: Status 404 returned error can't find the container with id a3138008c6085ef8cec97633548b4cbf28b1650f256f77a535261be4e0190910
Apr 17 11:33:12.230950 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.230922 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bedd9666-11f0-4ba3-999c-964028f81db4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gdfmh\" (UID: \"bedd9666-11f0-4ba3-999c-964028f81db4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"
Apr 17 11:33:12.233231 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.233207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bedd9666-11f0-4ba3-999c-964028f81db4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gdfmh\" (UID: \"bedd9666-11f0-4ba3-999c-964028f81db4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"
Apr 17 11:33:12.363114 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.363045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-775d77b857-hppnr" event={"ID":"2f2de312-3ca2-4d8f-ac12-6ed1decec071","Type":"ContainerStarted","Data":"a3138008c6085ef8cec97633548b4cbf28b1650f256f77a535261be4e0190910"}
Apr 17 11:33:12.407444 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.407418 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"
Apr 17 11:33:12.517845 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.517816 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh"]
Apr 17 11:33:12.520451 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:33:12.520427 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbedd9666_11f0_4ba3_999c_964028f81db4.slice/crio-444fe55021efef6f49615a35065f3a4de2f3b87af0baaba48d51dd072ac6d3d4 WatchSource:0}: Error finding container 444fe55021efef6f49615a35065f3a4de2f3b87af0baaba48d51dd072ac6d3d4: Status 404 returned error can't find the container with id 444fe55021efef6f49615a35065f3a4de2f3b87af0baaba48d51dd072ac6d3d4
Apr 17 11:33:12.797115 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.797074 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:33:12.800387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.800363 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hmj98\""
Apr 17 11:33:12.807777 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.807754 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c5z7w"
Apr 17 11:33:12.940928 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:12.940900 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c5z7w"]
Apr 17 11:33:12.944947 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:33:12.944910 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a43f043_3738_4fc8_9a0f_9a3de52038b5.slice/crio-a3845572a7077a81ba64dc622dfa6c34e6ce060132a2ba4cb1f3862278e7edf9 WatchSource:0}: Error finding container a3845572a7077a81ba64dc622dfa6c34e6ce060132a2ba4cb1f3862278e7edf9: Status 404 returned error can't find the container with id a3845572a7077a81ba64dc622dfa6c34e6ce060132a2ba4cb1f3862278e7edf9
Apr 17 11:33:13.367451 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.367410 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c5z7w" event={"ID":"8a43f043-3738-4fc8-9a0f-9a3de52038b5","Type":"ContainerStarted","Data":"a3845572a7077a81ba64dc622dfa6c34e6ce060132a2ba4cb1f3862278e7edf9"}
Apr 17 11:33:13.368543 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.368509 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh" event={"ID":"bedd9666-11f0-4ba3-999c-964028f81db4","Type":"ContainerStarted","Data":"444fe55021efef6f49615a35065f3a4de2f3b87af0baaba48d51dd072ac6d3d4"}
Apr 17 11:33:13.528218 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.528185 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:33:13.532247 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.532229 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.534795 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.534768 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 11:33:13.535348 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.535173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 11:33:13.535348 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.535220 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 11:33:13.535348 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.535220 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 11:33:13.535558 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.535484 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 11:33:13.536566 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.536539 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 11:33:13.536566 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.536560 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 11:33:13.536684 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.536543 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-athem5p22v0oq\""
Apr 17 11:33:13.536684 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.536561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 11:33:13.536807 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.536787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 11:33:13.537224 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.537124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 11:33:13.537224 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.537133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-968xp\""
Apr 17 11:33:13.537465 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.537450 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 11:33:13.539207 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.539185 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 11:33:13.544307 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.544287 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:33:13.640020 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.639926 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640020 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.639981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640020 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640020 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-config-out\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-config\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vbt\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-kube-api-access-p5vbt\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640449 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640877 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640877 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-web-config\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640877 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640877 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.640877 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.640609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741351 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-config-out\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-config\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vbt\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-kube-api-access-p5vbt\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-web-config\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741683 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.741824 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.742181 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.741831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.742181 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.742157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.742772 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.742740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.742772 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.742770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.743481 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.743165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.746299 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.745824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.746299 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.746022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.746627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.746794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-config-out\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.748948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.749117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.749297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.749672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.749798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:33:13.749884 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.749845 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:33:13.750801 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.750758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-config\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:33:13.750801 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.750787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-web-config\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:33:13.751672 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.751647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:33:13.755736 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.755714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vbt\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-kube-api-access-p5vbt\") pod \"prometheus-k8s-0\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
17 11:33:13.794487 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.794457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:33:13.842557 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:13.842512 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:33:14.112822 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.112771 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:33:14.118363 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:33:14.118321 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55a56e3_552e_48aa_99fb_a91e83075401.slice/crio-3704fc3503efd9855093bf05894ebfd84de982a015f43bbfe0080714318b1a49 WatchSource:0}: Error finding container 3704fc3503efd9855093bf05894ebfd84de982a015f43bbfe0080714318b1a49: Status 404 returned error can't find the container with id 3704fc3503efd9855093bf05894ebfd84de982a015f43bbfe0080714318b1a49 Apr 17 11:33:14.373044 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.372934 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh" event={"ID":"bedd9666-11f0-4ba3-999c-964028f81db4","Type":"ContainerStarted","Data":"dd9bb81c12cb6e05ee9ce214654517df94ddfa372d4291ec31eed7ee151bc8aa"} Apr 17 11:33:14.373508 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.373371 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh" Apr 17 11:33:14.374447 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.374421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerStarted","Data":"3704fc3503efd9855093bf05894ebfd84de982a015f43bbfe0080714318b1a49"} Apr 17 11:33:14.376038 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.376011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-775d77b857-hppnr" event={"ID":"2f2de312-3ca2-4d8f-ac12-6ed1decec071","Type":"ContainerStarted","Data":"684ccf7bf468f7aadf7427c1f315e77e05b3c25e12650229e642daf33039b9d5"} Apr 17 11:33:14.379448 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.379428 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh" Apr 17 11:33:14.389793 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.389745 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gdfmh" podStartSLOduration=0.955303162 podStartE2EDuration="2.389728109s" podCreationTimestamp="2026-04-17 11:33:12 +0000 UTC" firstStartedPulling="2026-04-17 11:33:12.522134862 +0000 UTC m=+170.319849457" lastFinishedPulling="2026-04-17 11:33:13.956559793 +0000 UTC m=+171.754274404" observedRunningTime="2026-04-17 11:33:14.388385557 +0000 UTC m=+172.186100173" watchObservedRunningTime="2026-04-17 11:33:14.389728109 +0000 UTC m=+172.187442728" Apr 17 11:33:14.407616 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:14.407572 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-775d77b857-hppnr" podStartSLOduration=1.612019859 podStartE2EDuration="3.407555502s" podCreationTimestamp="2026-04-17 11:33:11 +0000 UTC" firstStartedPulling="2026-04-17 11:33:12.15837957 +0000 UTC m=+169.956094168" lastFinishedPulling="2026-04-17 11:33:13.953915212 +0000 UTC m=+171.751629811" observedRunningTime="2026-04-17 11:33:14.406340047 +0000 UTC m=+172.204054665" watchObservedRunningTime="2026-04-17 11:33:14.407555502 
+0000 UTC m=+172.205270119" Apr 17 11:33:15.339533 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:15.339504 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gsgsr" Apr 17 11:33:15.380523 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:15.380492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c5z7w" event={"ID":"8a43f043-3738-4fc8-9a0f-9a3de52038b5","Type":"ContainerStarted","Data":"3a07cdafb475378c386a9e89b25f88404a40b8f4940c1d5a9829efdef09a14a1"} Apr 17 11:33:15.398003 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:15.397958 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c5z7w" podStartSLOduration=138.546183646 podStartE2EDuration="2m20.397941311s" podCreationTimestamp="2026-04-17 11:30:55 +0000 UTC" firstStartedPulling="2026-04-17 11:33:12.947079879 +0000 UTC m=+170.744794483" lastFinishedPulling="2026-04-17 11:33:14.798837554 +0000 UTC m=+172.596552148" observedRunningTime="2026-04-17 11:33:15.397834662 +0000 UTC m=+173.195549282" watchObservedRunningTime="2026-04-17 11:33:15.397941311 +0000 UTC m=+173.195655927" Apr 17 11:33:16.384927 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:16.384890 2577 generic.go:358] "Generic (PLEG): container finished" podID="f55a56e3-552e-48aa-99fb-a91e83075401" containerID="2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5" exitCode=0 Apr 17 11:33:16.385398 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:16.384978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5"} Apr 17 11:33:19.398581 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:19.398537 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerStarted","Data":"c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f"} Apr 17 11:33:19.398581 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:19.398578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerStarted","Data":"4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de"} Apr 17 11:33:21.410630 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:21.410589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerStarted","Data":"09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8"} Apr 17 11:33:21.410630 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:21.410632 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerStarted","Data":"20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70"} Apr 17 11:33:21.411032 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:21.410645 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerStarted","Data":"09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a"} Apr 17 11:33:21.411032 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:21.410659 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerStarted","Data":"4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55"} Apr 17 11:33:21.444657 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:21.444603 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.14772615 podStartE2EDuration="8.444585505s" podCreationTimestamp="2026-04-17 11:33:13 +0000 UTC" firstStartedPulling="2026-04-17 11:33:14.120497087 +0000 UTC m=+171.918211694" lastFinishedPulling="2026-04-17 11:33:20.417356438 +0000 UTC m=+178.215071049" observedRunningTime="2026-04-17 11:33:21.44294047 +0000 UTC m=+179.240655086" watchObservedRunningTime="2026-04-17 11:33:21.444585505 +0000 UTC m=+179.242300122" Apr 17 11:33:23.843304 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:23.843228 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:33:32.031350 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:32.031319 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-775d77b857-hppnr" Apr 17 11:33:32.031350 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:32.031358 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-775d77b857-hppnr" Apr 17 11:33:45.168502 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:45.168442 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" podUID="a1c4f670-41a2-4be5-8152-e22905ef9201" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:33:52.036510 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:52.036479 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-775d77b857-hppnr" Apr 17 11:33:52.040266 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:52.040236 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-775d77b857-hppnr" Apr 17 11:33:55.168101 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:33:55.168059 2577 
prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" podUID="a1c4f670-41a2-4be5-8152-e22905ef9201" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:34:05.168552 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:05.168507 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" podUID="a1c4f670-41a2-4be5-8152-e22905ef9201" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:34:05.169105 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:05.168591 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" Apr 17 11:34:05.169105 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:05.169043 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"015faa4307d68d5a7866c99080d38ae87713baa4b9dc8b1c9b05b3c37e0f7944"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 11:34:05.169200 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:05.169104 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" podUID="a1c4f670-41a2-4be5-8152-e22905ef9201" containerName="service-proxy" containerID="cri-o://015faa4307d68d5a7866c99080d38ae87713baa4b9dc8b1c9b05b3c37e0f7944" gracePeriod=30 Apr 17 11:34:05.527495 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:05.527462 2577 generic.go:358] "Generic (PLEG): container finished" podID="a1c4f670-41a2-4be5-8152-e22905ef9201" 
containerID="015faa4307d68d5a7866c99080d38ae87713baa4b9dc8b1c9b05b3c37e0f7944" exitCode=2 Apr 17 11:34:05.527662 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:05.527533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" event={"ID":"a1c4f670-41a2-4be5-8152-e22905ef9201","Type":"ContainerDied","Data":"015faa4307d68d5a7866c99080d38ae87713baa4b9dc8b1c9b05b3c37e0f7944"} Apr 17 11:34:05.527662 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:05.527572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6bf584f5-d22zl" event={"ID":"a1c4f670-41a2-4be5-8152-e22905ef9201","Type":"ContainerStarted","Data":"c18016b96b5d6941d63b60a7aedf8d8092105bd54f7dcd46479eba84392393a6"} Apr 17 11:34:13.843190 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:13.843132 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:34:13.863195 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:13.863169 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:34:14.566448 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:14.566421 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:34:31.915857 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:31.915824 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:34:31.916304 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:31.916255 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="prometheus" containerID="cri-o://4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de" gracePeriod=600 Apr 17 
11:34:31.916460 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:31.916302 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy" containerID="cri-o://20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70" gracePeriod=600 Apr 17 11:34:31.916460 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:31.916357 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="config-reloader" containerID="cri-o://c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f" gracePeriod=600 Apr 17 11:34:31.916460 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:31.916305 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="thanos-sidecar" containerID="cri-o://4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55" gracePeriod=600 Apr 17 11:34:31.916640 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:31.916364 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-web" containerID="cri-o://09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a" gracePeriod=600 Apr 17 11:34:31.916640 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:31.916364 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-thanos" containerID="cri-o://09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8" gracePeriod=600 Apr 17 11:34:32.610639 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610603 2577 
generic.go:358] "Generic (PLEG): container finished" podID="f55a56e3-552e-48aa-99fb-a91e83075401" containerID="09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8" exitCode=0 Apr 17 11:34:32.610639 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610630 2577 generic.go:358] "Generic (PLEG): container finished" podID="f55a56e3-552e-48aa-99fb-a91e83075401" containerID="20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70" exitCode=0 Apr 17 11:34:32.610639 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610636 2577 generic.go:358] "Generic (PLEG): container finished" podID="f55a56e3-552e-48aa-99fb-a91e83075401" containerID="4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55" exitCode=0 Apr 17 11:34:32.610639 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610643 2577 generic.go:358] "Generic (PLEG): container finished" podID="f55a56e3-552e-48aa-99fb-a91e83075401" containerID="c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f" exitCode=0 Apr 17 11:34:32.610639 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610647 2577 generic.go:358] "Generic (PLEG): container finished" podID="f55a56e3-552e-48aa-99fb-a91e83075401" containerID="4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de" exitCode=0 Apr 17 11:34:32.610936 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8"} Apr 17 11:34:32.610936 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610699 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70"} Apr 17 11:34:32.610936 ip-10-0-134-64 
kubenswrapper[2577]: I0417 11:34:32.610709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55"} Apr 17 11:34:32.610936 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f"} Apr 17 11:34:32.610936 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:32.610726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de"} Apr 17 11:34:33.150750 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.150727 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:34:33.302240 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302149 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-kube-rbac-proxy\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.302240 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302188 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-metrics-client-ca\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.302240 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302218 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.302542 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302256 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-web-config\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.302542 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302325 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-kubelet-serving-ca-bundle\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" 
(UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.302542 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302354 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-metrics-client-certs\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.302735 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302380 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-trusted-ca-bundle\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.302855 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302664 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:33.302855 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302763 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:33.302855 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302797 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:33.303083 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302862 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-config\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303083 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302903 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-thanos-prometheus-http-client-file\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303083 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302933 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5vbt\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-kube-api-access-p5vbt\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303083 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302955 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-grpc-tls\") pod 
\"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303083 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.302979 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-db\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303083 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303020 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-serving-certs-ca-bundle\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303083 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303064 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-tls-assets\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303453 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303089 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-rulefiles-0\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303453 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303120 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-config-out\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: 
\"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303453 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303159 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-tls\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303453 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303187 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f55a56e3-552e-48aa-99fb-a91e83075401\" (UID: \"f55a56e3-552e-48aa-99fb-a91e83075401\") " Apr 17 11:34:33.303453 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303432 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-metrics-client-ca\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.303810 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303455 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.303810 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.303474 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.304533 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.304506 2577 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:34:33.305119 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.305093 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:33.305530 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.305501 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.305940 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.305862 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.306714 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.306676 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 11:34:33.307314 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.307258 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.307763 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.307724 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.307763 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.307751 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.307922 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.307835 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-kube-api-access-p5vbt" (OuterVolumeSpecName: "kube-api-access-p5vbt") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "kube-api-access-p5vbt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:34:33.308147 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.308116 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.308239 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.308139 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-config" (OuterVolumeSpecName: "config") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.308387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.308236 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.308435 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.308416 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:34:33.308770 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.308744 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-config-out" (OuterVolumeSpecName: "config-out") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:34:33.318058 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.318031 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-web-config" (OuterVolumeSpecName: "web-config") pod "f55a56e3-552e-48aa-99fb-a91e83075401" (UID: "f55a56e3-552e-48aa-99fb-a91e83075401"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:34:33.404493 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404460 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404493 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404488 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-kube-rbac-proxy\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404503 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404517 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-web-config\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404529 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-metrics-client-certs\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404551 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-config\") on node 
\"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404564 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404577 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5vbt\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-kube-api-access-p5vbt\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404591 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-grpc-tls\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404603 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-db\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404616 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404628 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f55a56e3-552e-48aa-99fb-a91e83075401-tls-assets\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 
17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404643 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f55a56e3-552e-48aa-99fb-a91e83075401-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404656 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f55a56e3-552e-48aa-99fb-a91e83075401-config-out\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.404709 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.404671 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f55a56e3-552e-48aa-99fb-a91e83075401-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:34:33.606556 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.606460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:34:33.608780 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.608749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4155f35e-1865-499f-88fb-fdde1e2c1218-metrics-certs\") pod \"network-metrics-daemon-xw9bz\" (UID: \"4155f35e-1865-499f-88fb-fdde1e2c1218\") " pod="openshift-multus/network-metrics-daemon-xw9bz" Apr 17 11:34:33.616280 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.616244 2577 generic.go:358] "Generic (PLEG): container finished" podID="f55a56e3-552e-48aa-99fb-a91e83075401" 
containerID="09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a" exitCode=0 Apr 17 11:34:33.616404 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.616333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a"} Apr 17 11:34:33.616404 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.616353 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 11:34:33.616404 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.616378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f55a56e3-552e-48aa-99fb-a91e83075401","Type":"ContainerDied","Data":"3704fc3503efd9855093bf05894ebfd84de982a015f43bbfe0080714318b1a49"} Apr 17 11:34:33.616404 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.616400 2577 scope.go:117] "RemoveContainer" containerID="09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8" Apr 17 11:34:33.624003 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.623852 2577 scope.go:117] "RemoveContainer" containerID="20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70" Apr 17 11:34:33.630283 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.630249 2577 scope.go:117] "RemoveContainer" containerID="09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a" Apr 17 11:34:33.636380 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.636366 2577 scope.go:117] "RemoveContainer" containerID="4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55" Apr 17 11:34:33.639380 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.639356 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:34:33.642716 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:34:33.642698 2577 scope.go:117] "RemoveContainer" containerID="c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f" Apr 17 11:34:33.643801 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.643787 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:34:33.648677 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.648664 2577 scope.go:117] "RemoveContainer" containerID="4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de" Apr 17 11:34:33.654882 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.654865 2577 scope.go:117] "RemoveContainer" containerID="2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5" Apr 17 11:34:33.660788 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.660773 2577 scope.go:117] "RemoveContainer" containerID="09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8" Apr 17 11:34:33.661034 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:33.661015 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8\": container with ID starting with 09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8 not found: ID does not exist" containerID="09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8" Apr 17 11:34:33.661084 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661043 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8"} err="failed to get container status \"09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8\": rpc error: code = NotFound desc = could not find container \"09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8\": container with ID starting with 09e206e04d13460c38b29d9731b64012793e6480afb35860a46718731b5182a8 not found: ID does not exist" 
Apr 17 11:34:33.661084 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661063 2577 scope.go:117] "RemoveContainer" containerID="20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70" Apr 17 11:34:33.661358 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:33.661339 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70\": container with ID starting with 20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70 not found: ID does not exist" containerID="20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70" Apr 17 11:34:33.661417 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661366 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70"} err="failed to get container status \"20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70\": rpc error: code = NotFound desc = could not find container \"20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70\": container with ID starting with 20abfcb353f42c0c0669031d76972e966b07166b4efb943653e6b1bf91cbbe70 not found: ID does not exist" Apr 17 11:34:33.661417 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661384 2577 scope.go:117] "RemoveContainer" containerID="09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a" Apr 17 11:34:33.661577 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:33.661563 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a\": container with ID starting with 09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a not found: ID does not exist" containerID="09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a" Apr 17 11:34:33.661612 
ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661581 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a"} err="failed to get container status \"09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a\": rpc error: code = NotFound desc = could not find container \"09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a\": container with ID starting with 09068f383b4f9ec1a4248a0473ae2cae49f6728cf6d6642a3cc4b9a8fe85a53a not found: ID does not exist" Apr 17 11:34:33.661612 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661594 2577 scope.go:117] "RemoveContainer" containerID="4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55" Apr 17 11:34:33.661797 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:33.661783 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55\": container with ID starting with 4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55 not found: ID does not exist" containerID="4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55" Apr 17 11:34:33.661850 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661798 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55"} err="failed to get container status \"4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55\": rpc error: code = NotFound desc = could not find container \"4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55\": container with ID starting with 4c00f08ad56046a2a12b00eb2f7b7d4d4c0661727294fb9811e1f6f3621e7f55 not found: ID does not exist" Apr 17 11:34:33.661850 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661808 2577 scope.go:117] "RemoveContainer" 
containerID="c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f" Apr 17 11:34:33.661989 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:33.661974 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f\": container with ID starting with c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f not found: ID does not exist" containerID="c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f" Apr 17 11:34:33.662026 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.661993 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f"} err="failed to get container status \"c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f\": rpc error: code = NotFound desc = could not find container \"c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f\": container with ID starting with c25a29cee792eaacef812f5a3b0d182fff92c1e86f65c77ccd50d1c7d89c6f6f not found: ID does not exist" Apr 17 11:34:33.662026 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.662006 2577 scope.go:117] "RemoveContainer" containerID="4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de" Apr 17 11:34:33.662201 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:33.662184 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de\": container with ID starting with 4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de not found: ID does not exist" containerID="4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de" Apr 17 11:34:33.662246 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.662204 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de"} err="failed to get container status \"4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de\": rpc error: code = NotFound desc = could not find container \"4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de\": container with ID starting with 4cdb5d1fc98bf806d2ecc9856b639e4dc31bbc99eb49526019314719abb453de not found: ID does not exist" Apr 17 11:34:33.662246 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.662215 2577 scope.go:117] "RemoveContainer" containerID="2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5" Apr 17 11:34:33.662438 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:33.662424 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5\": container with ID starting with 2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5 not found: ID does not exist" containerID="2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5" Apr 17 11:34:33.662479 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.662441 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5"} err="failed to get container status \"2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5\": rpc error: code = NotFound desc = could not find container \"2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5\": container with ID starting with 2b8e071bb93503db567e3ba1a84677df66549b1cb9815755003d0355e7e094e5 not found: ID does not exist" Apr 17 11:34:33.670104 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670086 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 11:34:33.670342 ip-10-0-134-64 kubenswrapper[2577]: 
I0417 11:34:33.670330 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="prometheus" Apr 17 11:34:33.670387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670345 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="prometheus" Apr 17 11:34:33.670387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670360 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="init-config-reloader" Apr 17 11:34:33.670387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670366 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="init-config-reloader" Apr 17 11:34:33.670387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670373 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy" Apr 17 11:34:33.670387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670379 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy" Apr 17 11:34:33.670387 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670387 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-web" Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670393 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-web" Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670400 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="config-reloader" Apr 17 11:34:33.670596 
ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670405 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="config-reloader"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670410 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-thanos"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670415 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-thanos"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670423 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="thanos-sidecar"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670428 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="thanos-sidecar"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670467 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="config-reloader"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670475 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670481 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="thanos-sidecar"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670486 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-thanos"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670493 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="prometheus"
Apr 17 11:34:33.670596 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.670500 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" containerName="kube-rbac-proxy-web"
Apr 17 11:34:33.675618 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.675604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.678416 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.678391 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 11:34:33.678489 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.678391 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 11:34:33.678674 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.678658 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 11:34:33.678764 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.678748 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 11:34:33.678975 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.678957 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 11:34:33.678975 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.678976 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-athem5p22v0oq\""
Apr 17 11:34:33.678975 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.678960 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 11:34:33.679149 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.679022 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 11:34:33.679149 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.679113 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-968xp\""
Apr 17 11:34:33.679778 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.679761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 11:34:33.679870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.679826 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 11:34:33.679870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.679854 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 11:34:33.681803 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.681564 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 11:34:33.684914 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.684894 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 11:34:33.687149 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.687131 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:34:33.808127 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb44461a-db39-4d2a-9070-8d8384d3602a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808320 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808135 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb44461a-db39-4d2a-9070-8d8384d3602a-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808320 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808320 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xtpx\" (UniqueName: \"kubernetes.io/projected/cb44461a-db39-4d2a-9070-8d8384d3602a-kube-api-access-9xtpx\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808320 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808320 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-config\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808327 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808514 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808763 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808763 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808763 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.808763 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.808614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.898520 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.898456 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lzg8w\""
Apr 17 11:34:33.906581 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.906563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw9bz"
Apr 17 11:34:33.909448 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909506 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909506 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909677 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb44461a-db39-4d2a-9070-8d8384d3602a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909769 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb44461a-db39-4d2a-9070-8d8384d3602a-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909769 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xtpx\" (UniqueName: \"kubernetes.io/projected/cb44461a-db39-4d2a-9070-8d8384d3602a-kube-api-access-9xtpx\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.909870 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-config\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910022 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910022 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910022 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910022 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.909984 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910022 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.910010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910252 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.910045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910252 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.910074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910252 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.910114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.910252 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.910137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.911135 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.910769 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.911538 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.911513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.912907 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.912353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.912907 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.912427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.912907 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.912865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.913120 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.912975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.913578 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.913530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-web-config\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.914060 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.914020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cb44461a-db39-4d2a-9070-8d8384d3602a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.914365 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.914320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.914637 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.914591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.914637 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.914604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.915157 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.915138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.915200 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.915174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.915669 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.915647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-config\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.915868 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.915844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.915922 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.915858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cb44461a-db39-4d2a-9070-8d8384d3602a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.916299 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.916264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cb44461a-db39-4d2a-9070-8d8384d3602a-config-out\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.920953 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.920922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xtpx\" (UniqueName: \"kubernetes.io/projected/cb44461a-db39-4d2a-9070-8d8384d3602a-kube-api-access-9xtpx\") pod \"prometheus-k8s-0\" (UID: \"cb44461a-db39-4d2a-9070-8d8384d3602a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:33.984815 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:33.984784 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:34:34.031226 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:34.031050 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xw9bz"]
Apr 17 11:34:34.035858 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:34:34.035804 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4155f35e_1865_499f_88fb_fdde1e2c1218.slice/crio-43cc3dadb43feaba8f3670963128e0d09eac5bc1032f6ce608e40b50db34761f WatchSource:0}: Error finding container 43cc3dadb43feaba8f3670963128e0d09eac5bc1032f6ce608e40b50db34761f: Status 404 returned error can't find the container with id 43cc3dadb43feaba8f3670963128e0d09eac5bc1032f6ce608e40b50db34761f
Apr 17 11:34:34.118086 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:34.118048 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 11:34:34.122975 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:34:34.122946 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb44461a_db39_4d2a_9070_8d8384d3602a.slice/crio-bd74a60896b61486d90b845ffc352d79391002bf326d7524f2cd6b3bd2207faa WatchSource:0}: Error finding container bd74a60896b61486d90b845ffc352d79391002bf326d7524f2cd6b3bd2207faa: Status 404 returned error can't find the container with id bd74a60896b61486d90b845ffc352d79391002bf326d7524f2cd6b3bd2207faa
Apr 17 11:34:34.621368 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:34.621327 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xw9bz" event={"ID":"4155f35e-1865-499f-88fb-fdde1e2c1218","Type":"ContainerStarted","Data":"43cc3dadb43feaba8f3670963128e0d09eac5bc1032f6ce608e40b50db34761f"}
Apr 17 11:34:34.622757 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:34.622725 2577 generic.go:358] "Generic (PLEG): container finished" podID="cb44461a-db39-4d2a-9070-8d8384d3602a" containerID="669fbf51a18b5d7e033afd27f6ae34d4b2c49fbfb91046f52bdb2cc01e427fb1" exitCode=0
Apr 17 11:34:34.622868 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:34.622792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerDied","Data":"669fbf51a18b5d7e033afd27f6ae34d4b2c49fbfb91046f52bdb2cc01e427fb1"}
Apr 17 11:34:34.622868 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:34.622816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerStarted","Data":"bd74a60896b61486d90b845ffc352d79391002bf326d7524f2cd6b3bd2207faa"}
Apr 17 11:34:34.798632 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:34.798591 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55a56e3-552e-48aa-99fb-a91e83075401" path="/var/lib/kubelet/pods/f55a56e3-552e-48aa-99fb-a91e83075401/volumes"
Apr 17 11:34:35.629031 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.628987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerStarted","Data":"7a54180aeab9d7204f96c2d336fb286cc0dc933422c101faaaaf715935401e2d"}
Apr 17 11:34:35.629031 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.629033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerStarted","Data":"c54ef08f659bbb27c3577e30c0aba21e412fb89f16bd7f02a3739bf50e3fb694"}
Apr 17 11:34:35.629539 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.629049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerStarted","Data":"6e74405bdf65ca5a7cc3b4e120f9e6ca7d6be945d37e9354a4f678f46c5ae085"}
Apr 17 11:34:35.629539 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.629061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerStarted","Data":"c84594cbcb574f04d1a719bb147d05ad5032253306992b7f42cde57bbad48488"}
Apr 17 11:34:35.629539 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.629074 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerStarted","Data":"d9af032f9f747b4d8a51a77c72322059466c6f9979f7df415ddd968629531f1c"}
Apr 17 11:34:35.629539 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.629089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cb44461a-db39-4d2a-9070-8d8384d3602a","Type":"ContainerStarted","Data":"6a3c0c443ea5f3dbfaeaeecaf9907c8363a1baed58172599c7e1b8797793e40d"}
Apr 17 11:34:35.630604 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.630582 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xw9bz" event={"ID":"4155f35e-1865-499f-88fb-fdde1e2c1218","Type":"ContainerStarted","Data":"70a7d81a1efb5e5e55d06f775dc0a97f79096c213c7b0b62ae313f8be75b03e5"}
Apr 17 11:34:35.630695 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.630610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xw9bz" event={"ID":"4155f35e-1865-499f-88fb-fdde1e2c1218","Type":"ContainerStarted","Data":"92d2dcce621e3ad15da828055525298665fddb88cf372fa72645f182a32c483b"}
Apr 17 11:34:35.660240 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.660188 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.660171326 podStartE2EDuration="2.660171326s" podCreationTimestamp="2026-04-17 11:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:34:35.658466381 +0000 UTC m=+253.456180997" watchObservedRunningTime="2026-04-17 11:34:35.660171326 +0000 UTC m=+253.457885944"
Apr 17 11:34:35.676594 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:35.676546 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xw9bz" podStartSLOduration=252.784612679 podStartE2EDuration="4m13.676533771s" podCreationTimestamp="2026-04-17 11:30:22 +0000 UTC" firstStartedPulling="2026-04-17 11:34:34.038392385 +0000 UTC m=+251.836106994" lastFinishedPulling="2026-04-17 11:34:34.930313488 +0000 UTC m=+252.728028086" observedRunningTime="2026-04-17 11:34:35.675418779 +0000 UTC m=+253.473133396" watchObservedRunningTime="2026-04-17 11:34:35.676533771 +0000 UTC m=+253.474248387"
Apr 17 11:34:35.727302 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:35.727235 2577 configmap.go:193] Couldn't get configMap openshift-monitoring/prometheus-k8s-rulefiles-0: configmap "prometheus-k8s-rulefiles-0" not found
Apr 17 11:34:35.727469 ip-10-0-134-64 kubenswrapper[2577]: E0417 11:34:35.727362 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-rulefiles-0 podName:cb44461a-db39-4d2a-9070-8d8384d3602a nodeName:}" failed. No retries permitted until 2026-04-17 11:34:36.227342429 +0000 UTC m=+254.025057028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-k8s-rulefiles-0" (UniqueName: "kubernetes.io/configmap/cb44461a-db39-4d2a-9070-8d8384d3602a-prometheus-k8s-rulefiles-0") pod "prometheus-k8s-0" (UID: "cb44461a-db39-4d2a-9070-8d8384d3602a") : configmap "prometheus-k8s-rulefiles-0" not found
Apr 17 11:34:38.985823 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:34:38.985775 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:35:22.687015 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:35:22.686985 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:35:22.688628 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:35:22.688601 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:35:22.694562 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:35:22.694541 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 11:35:33.985466 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:35:33.985408 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:35:34.000793 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:35:34.000767 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:35:34.813322 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:35:34.813296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 11:38:08.836754 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.836722 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f"]
Apr 17 11:38:08.839885 ip-10-0-134-64 kubenswrapper[2577]:
I0417 11:38:08.839868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:08.842488 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.842467 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-j4fmm\"" Apr 17 11:38:08.842583 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.842467 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 11:38:08.843599 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.843580 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 17 11:38:08.843599 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.843594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 17 11:38:08.843732 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.843604 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 11:38:08.848158 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.848136 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f"] Apr 17 11:38:08.965813 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.965778 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-cert\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:08.965996 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.965847 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:08.965996 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:08.965894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hkq9\" (UniqueName: \"kubernetes.io/projected/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-kube-api-access-4hkq9\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.066796 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.066766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.066952 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.066818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hkq9\" (UniqueName: \"kubernetes.io/projected/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-kube-api-access-4hkq9\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.066952 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.066840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-cert\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.067436 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.067413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.068975 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.068957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-cert\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.075445 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.075420 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hkq9\" (UniqueName: \"kubernetes.io/projected/d1ae9b7f-5a9c-472f-a8f2-5bdbae946599-kube-api-access-4hkq9\") pod \"kubeflow-trainer-controller-manager-5995989c79-fm62f\" (UID: \"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599\") " pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.149195 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.149113 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:38:09.263281 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.263247 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f"] Apr 17 11:38:09.265702 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:38:09.265676 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1ae9b7f_5a9c_472f_a8f2_5bdbae946599.slice/crio-912c7e88df09f76643b9c636c77f74dd70c2305292724a2433a90add30dc7518 WatchSource:0}: Error finding container 912c7e88df09f76643b9c636c77f74dd70c2305292724a2433a90add30dc7518: Status 404 returned error can't find the container with id 912c7e88df09f76643b9c636c77f74dd70c2305292724a2433a90add30dc7518 Apr 17 11:38:09.267513 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:09.267497 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:38:10.218728 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:10.218676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" event={"ID":"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599","Type":"ContainerStarted","Data":"912c7e88df09f76643b9c636c77f74dd70c2305292724a2433a90add30dc7518"} Apr 17 11:38:12.225311 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:12.225255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" event={"ID":"d1ae9b7f-5a9c-472f-a8f2-5bdbae946599","Type":"ContainerStarted","Data":"63f32e424d9e12551d9b5d219ac2ea60740caba86108b731d55cf38f4b547a48"} Apr 17 11:38:12.225676 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:12.225352 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 
11:38:12.243457 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:12.243418 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" podStartSLOduration=1.963088985 podStartE2EDuration="4.243404277s" podCreationTimestamp="2026-04-17 11:38:08 +0000 UTC" firstStartedPulling="2026-04-17 11:38:09.267620036 +0000 UTC m=+467.065334635" lastFinishedPulling="2026-04-17 11:38:11.547935329 +0000 UTC m=+469.345649927" observedRunningTime="2026-04-17 11:38:12.242590013 +0000 UTC m=+470.040304644" watchObservedRunningTime="2026-04-17 11:38:12.243404277 +0000 UTC m=+470.041118893" Apr 17 11:38:28.232978 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:38:28.232942 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-5995989c79-fm62f" Apr 17 11:39:59.382011 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.381973 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf"] Apr 17 11:39:59.384014 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.383998 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:39:59.386552 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.386528 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"kube-root-ca.crt\"" Apr 17 11:39:59.386649 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.386563 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"openshift-service-ca.crt\"" Apr 17 11:39:59.386649 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.386625 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-hfmts\"/\"default-dockercfg-h6m6b\"" Apr 17 11:39:59.400924 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.400902 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf"] Apr 17 11:39:59.424430 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.424402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98dl\" (UniqueName: \"kubernetes.io/projected/abe9df75-bf9a-4cb2-af7f-1da834a42517-kube-api-access-x98dl\") pod \"progression-enabled-node-0-0-qwhrf\" (UID: \"abe9df75-bf9a-4cb2-af7f-1da834a42517\") " pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:39:59.525830 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.525786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98dl\" (UniqueName: \"kubernetes.io/projected/abe9df75-bf9a-4cb2-af7f-1da834a42517-kube-api-access-x98dl\") pod \"progression-enabled-node-0-0-qwhrf\" (UID: \"abe9df75-bf9a-4cb2-af7f-1da834a42517\") " pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:39:59.535756 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.535727 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x98dl\" (UniqueName: \"kubernetes.io/projected/abe9df75-bf9a-4cb2-af7f-1da834a42517-kube-api-access-x98dl\") pod \"progression-enabled-node-0-0-qwhrf\" (UID: \"abe9df75-bf9a-4cb2-af7f-1da834a42517\") " pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:39:59.694090 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.694007 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:39:59.812574 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:39:59.812554 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf"] Apr 17 11:39:59.814800 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:39:59.814775 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe9df75_bf9a_4cb2_af7f_1da834a42517.slice/crio-c818b47c89809bbacaf5d3b67639e2e0a0213836358402dc4b9cc25dfe13c5d3 WatchSource:0}: Error finding container c818b47c89809bbacaf5d3b67639e2e0a0213836358402dc4b9cc25dfe13c5d3: Status 404 returned error can't find the container with id c818b47c89809bbacaf5d3b67639e2e0a0213836358402dc4b9cc25dfe13c5d3 Apr 17 11:40:00.524309 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:40:00.524236 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" event={"ID":"abe9df75-bf9a-4cb2-af7f-1da834a42517","Type":"ContainerStarted","Data":"c818b47c89809bbacaf5d3b67639e2e0a0213836358402dc4b9cc25dfe13c5d3"} Apr 17 11:40:22.712756 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:40:22.712726 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log" Apr 17 11:40:22.713238 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:40:22.712867 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log" Apr 17 11:41:43.842290 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:41:43.842233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" event={"ID":"abe9df75-bf9a-4cb2-af7f-1da834a42517","Type":"ContainerStarted","Data":"9c58bf1c1e182c2de6274239ca21bafba50af1560e7535ec574794c312032a94"} Apr 17 11:41:43.842712 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:41:43.842402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:41:43.860243 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:41:43.860187 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" podStartSLOduration=1.327868888 podStartE2EDuration="1m44.860167989s" podCreationTimestamp="2026-04-17 11:39:59 +0000 UTC" firstStartedPulling="2026-04-17 11:39:59.81685015 +0000 UTC m=+577.614564745" lastFinishedPulling="2026-04-17 11:41:43.349149234 +0000 UTC m=+681.146863846" observedRunningTime="2026-04-17 11:41:43.859605969 +0000 UTC m=+681.657320581" watchObservedRunningTime="2026-04-17 11:41:43.860167989 +0000 UTC m=+681.657882612" Apr 17 11:41:45.847625 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:41:45.847597 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:42:06.845997 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:06.845956 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" podUID="abe9df75-bf9a-4cb2-af7f-1da834a42517" containerName="node" probeResult="failure" output="Get \"http://10.133.0.18:28080/metrics\": 
dial tcp 10.133.0.18:28080: connect: connection refused" Apr 17 11:42:06.904063 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:06.904030 2577 generic.go:358] "Generic (PLEG): container finished" podID="abe9df75-bf9a-4cb2-af7f-1da834a42517" containerID="9c58bf1c1e182c2de6274239ca21bafba50af1560e7535ec574794c312032a94" exitCode=0 Apr 17 11:42:06.904215 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:06.904101 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" event={"ID":"abe9df75-bf9a-4cb2-af7f-1da834a42517","Type":"ContainerDied","Data":"9c58bf1c1e182c2de6274239ca21bafba50af1560e7535ec574794c312032a94"} Apr 17 11:42:08.033993 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:08.033972 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:42:08.128013 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:08.127980 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x98dl\" (UniqueName: \"kubernetes.io/projected/abe9df75-bf9a-4cb2-af7f-1da834a42517-kube-api-access-x98dl\") pod \"abe9df75-bf9a-4cb2-af7f-1da834a42517\" (UID: \"abe9df75-bf9a-4cb2-af7f-1da834a42517\") " Apr 17 11:42:08.130119 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:08.130097 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe9df75-bf9a-4cb2-af7f-1da834a42517-kube-api-access-x98dl" (OuterVolumeSpecName: "kube-api-access-x98dl") pod "abe9df75-bf9a-4cb2-af7f-1da834a42517" (UID: "abe9df75-bf9a-4cb2-af7f-1da834a42517"). InnerVolumeSpecName "kube-api-access-x98dl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:42:08.228945 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:08.228915 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x98dl\" (UniqueName: \"kubernetes.io/projected/abe9df75-bf9a-4cb2-af7f-1da834a42517-kube-api-access-x98dl\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\"" Apr 17 11:42:08.911057 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:08.911021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" event={"ID":"abe9df75-bf9a-4cb2-af7f-1da834a42517","Type":"ContainerDied","Data":"c818b47c89809bbacaf5d3b67639e2e0a0213836358402dc4b9cc25dfe13c5d3"} Apr 17 11:42:08.911057 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:08.911061 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c818b47c89809bbacaf5d3b67639e2e0a0213836358402dc4b9cc25dfe13c5d3" Apr 17 11:42:08.911241 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:42:08.911038 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf" Apr 17 11:45:22.733414 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:45:22.733333 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log" Apr 17 11:45:22.734466 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:45:22.734447 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log" Apr 17 11:46:45.735258 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.735170 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh"] Apr 17 11:46:45.737617 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.735465 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abe9df75-bf9a-4cb2-af7f-1da834a42517" containerName="node" Apr 17 11:46:45.737617 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.735476 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe9df75-bf9a-4cb2-af7f-1da834a42517" containerName="node" Apr 17 11:46:45.737617 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.735521 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="abe9df75-bf9a-4cb2-af7f-1da834a42517" containerName="node" Apr 17 11:46:45.738471 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.738455 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:46:45.740863 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.740844 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"kube-root-ca.crt\"" Apr 17 11:46:45.740958 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.740870 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"openshift-service-ca.crt\"" Apr 17 11:46:45.740958 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.740918 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-hfmts\"/\"default-dockercfg-h6m6b\"" Apr 17 11:46:45.754494 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.754468 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh"] Apr 17 11:46:45.791925 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.791894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jsrc\" (UniqueName: \"kubernetes.io/projected/6ec037c8-de07-4100-93e9-15501330e152-kube-api-access-4jsrc\") pod \"progression-disabled-node-0-0-kgmdh\" (UID: \"6ec037c8-de07-4100-93e9-15501330e152\") " pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:46:45.893242 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.893208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jsrc\" (UniqueName: \"kubernetes.io/projected/6ec037c8-de07-4100-93e9-15501330e152-kube-api-access-4jsrc\") pod \"progression-disabled-node-0-0-kgmdh\" (UID: \"6ec037c8-de07-4100-93e9-15501330e152\") " pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:46:45.901587 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:45.901562 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4jsrc\" (UniqueName: \"kubernetes.io/projected/6ec037c8-de07-4100-93e9-15501330e152-kube-api-access-4jsrc\") pod \"progression-disabled-node-0-0-kgmdh\" (UID: \"6ec037c8-de07-4100-93e9-15501330e152\") " pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:46:46.047943 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:46.047852 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:46:46.164965 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:46.164935 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh"] Apr 17 11:46:46.168471 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:46:46.168445 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec037c8_de07_4100_93e9_15501330e152.slice/crio-5e4dd73ea23467c5eaaf8064298cdb06faf228f3d36c4357d2b50c1762625ce2 WatchSource:0}: Error finding container 5e4dd73ea23467c5eaaf8064298cdb06faf228f3d36c4357d2b50c1762625ce2: Status 404 returned error can't find the container with id 5e4dd73ea23467c5eaaf8064298cdb06faf228f3d36c4357d2b50c1762625ce2 Apr 17 11:46:46.170597 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:46.170579 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:46:46.674289 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:46.674241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" event={"ID":"6ec037c8-de07-4100-93e9-15501330e152","Type":"ContainerStarted","Data":"c41f32bf9044d9794ed2529a55aa5ee3aca53ae93d76c62270f295d0e4d7136a"} Apr 17 11:46:46.674673 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:46.674639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" event={"ID":"6ec037c8-de07-4100-93e9-15501330e152","Type":"ContainerStarted","Data":"5e4dd73ea23467c5eaaf8064298cdb06faf228f3d36c4357d2b50c1762625ce2"} Apr 17 11:46:46.674767 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:46.674715 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:46:46.697198 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:46.697145 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" podStartSLOduration=1.6971320109999999 podStartE2EDuration="1.697132011s" podCreationTimestamp="2026-04-17 11:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:46:46.695775899 +0000 UTC m=+984.493490517" watchObservedRunningTime="2026-04-17 11:46:46.697132011 +0000 UTC m=+984.494846628" Apr 17 11:46:48.680608 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:46:48.680574 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:47:09.678498 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:09.678407 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" podUID="6ec037c8-de07-4100-93e9-15501330e152" containerName="node" probeResult="failure" output="Get \"http://10.133.0.19:28080/metrics\": dial tcp 10.133.0.19:28080: connect: connection refused" Apr 17 11:47:09.738939 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:09.738909 2577 generic.go:358] "Generic (PLEG): container finished" podID="6ec037c8-de07-4100-93e9-15501330e152" containerID="c41f32bf9044d9794ed2529a55aa5ee3aca53ae93d76c62270f295d0e4d7136a" exitCode=0 Apr 17 11:47:09.739073 
ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:09.738963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" event={"ID":"6ec037c8-de07-4100-93e9-15501330e152","Type":"ContainerDied","Data":"c41f32bf9044d9794ed2529a55aa5ee3aca53ae93d76c62270f295d0e4d7136a"} Apr 17 11:47:10.859530 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:10.859506 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" Apr 17 11:47:10.892505 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:10.892468 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jsrc\" (UniqueName: \"kubernetes.io/projected/6ec037c8-de07-4100-93e9-15501330e152-kube-api-access-4jsrc\") pod \"6ec037c8-de07-4100-93e9-15501330e152\" (UID: \"6ec037c8-de07-4100-93e9-15501330e152\") " Apr 17 11:47:10.894484 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:10.894455 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec037c8-de07-4100-93e9-15501330e152-kube-api-access-4jsrc" (OuterVolumeSpecName: "kube-api-access-4jsrc") pod "6ec037c8-de07-4100-93e9-15501330e152" (UID: "6ec037c8-de07-4100-93e9-15501330e152"). InnerVolumeSpecName "kube-api-access-4jsrc". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:47:10.992970 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:10.992894 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jsrc\" (UniqueName: \"kubernetes.io/projected/6ec037c8-de07-4100-93e9-15501330e152-kube-api-access-4jsrc\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\""
Apr 17 11:47:11.745903 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:11.745868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh" event={"ID":"6ec037c8-de07-4100-93e9-15501330e152","Type":"ContainerDied","Data":"5e4dd73ea23467c5eaaf8064298cdb06faf228f3d36c4357d2b50c1762625ce2"}
Apr 17 11:47:11.745903 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:11.745897 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh"
Apr 17 11:47:11.746103 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:11.745901 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e4dd73ea23467c5eaaf8064298cdb06faf228f3d36c4357d2b50c1762625ce2"
Apr 17 11:47:20.732427 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.732387 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"]
Apr 17 11:47:20.733013 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.732808 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ec037c8-de07-4100-93e9-15501330e152" containerName="node"
Apr 17 11:47:20.733013 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.732826 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec037c8-de07-4100-93e9-15501330e152" containerName="node"
Apr 17 11:47:20.733013 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.732898 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ec037c8-de07-4100-93e9-15501330e152" containerName="node"
Apr 17 11:47:20.735831 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.735809 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:20.739325 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.739294 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"openshift-service-ca.crt\""
Apr 17 11:47:20.739415 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.739396 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-hfmts\"/\"default-dockercfg-h6m6b\""
Apr 17 11:47:20.747238 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.747219 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-hfmts\"/\"kube-root-ca.crt\""
Apr 17 11:47:20.758868 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.758846 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"]
Apr 17 11:47:20.863808 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.863774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74xh7\" (UniqueName: \"kubernetes.io/projected/c13e3d99-a2a7-4358-8c23-eebc46aa3ab1-kube-api-access-74xh7\") pod \"progression-invalid-node-0-0-z5sd6\" (UID: \"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1\") " pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:20.964494 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.964461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74xh7\" (UniqueName: \"kubernetes.io/projected/c13e3d99-a2a7-4358-8c23-eebc46aa3ab1-kube-api-access-74xh7\") pod \"progression-invalid-node-0-0-z5sd6\" (UID: \"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1\") " pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:20.973565 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:20.973534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74xh7\" (UniqueName: \"kubernetes.io/projected/c13e3d99-a2a7-4358-8c23-eebc46aa3ab1-kube-api-access-74xh7\") pod \"progression-invalid-node-0-0-z5sd6\" (UID: \"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1\") " pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:21.044831 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:21.044756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:21.167844 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:21.167791 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"]
Apr 17 11:47:21.171055 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:47:21.171028 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13e3d99_a2a7_4358_8c23_eebc46aa3ab1.slice/crio-b4ca35e081d38169315291e9e42f1bb9bb9128e4017db91b187b86ec0b0cd993 WatchSource:0}: Error finding container b4ca35e081d38169315291e9e42f1bb9bb9128e4017db91b187b86ec0b0cd993: Status 404 returned error can't find the container with id b4ca35e081d38169315291e9e42f1bb9bb9128e4017db91b187b86ec0b0cd993
Apr 17 11:47:21.776522 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:21.776485 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6" event={"ID":"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1","Type":"ContainerStarted","Data":"2530837c7e129ff61d1713e63b423173ba8173d885d4ca839a4188147fa2b82b"}
Apr 17 11:47:21.776522 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:21.776526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6" event={"ID":"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1","Type":"ContainerStarted","Data":"b4ca35e081d38169315291e9e42f1bb9bb9128e4017db91b187b86ec0b0cd993"}
Apr 17 11:47:21.776944 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:21.776555 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:21.794161 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:21.794114 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6" podStartSLOduration=1.794101551 podStartE2EDuration="1.794101551s" podCreationTimestamp="2026-04-17 11:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:47:21.793066222 +0000 UTC m=+1019.590780849" watchObservedRunningTime="2026-04-17 11:47:21.794101551 +0000 UTC m=+1019.591816167"
Apr 17 11:47:23.782302 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:23.782258 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:44.779416 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:44.779375 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6" podUID="c13e3d99-a2a7-4358-8c23-eebc46aa3ab1" containerName="node" probeResult="failure" output="Get \"http://10.133.0.20:28080/metrics\": dial tcp 10.133.0.20:28080: connect: connection refused"
Apr 17 11:47:44.847313 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:44.847280 2577 generic.go:358] "Generic (PLEG): container finished" podID="c13e3d99-a2a7-4358-8c23-eebc46aa3ab1" containerID="2530837c7e129ff61d1713e63b423173ba8173d885d4ca839a4188147fa2b82b" exitCode=0
Apr 17 11:47:44.847470 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:44.847342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6" event={"ID":"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1","Type":"ContainerDied","Data":"2530837c7e129ff61d1713e63b423173ba8173d885d4ca839a4188147fa2b82b"}
Apr 17 11:47:45.970228 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:45.970207 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:46.067632 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:46.067601 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74xh7\" (UniqueName: \"kubernetes.io/projected/c13e3d99-a2a7-4358-8c23-eebc46aa3ab1-kube-api-access-74xh7\") pod \"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1\" (UID: \"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1\") "
Apr 17 11:47:46.069693 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:46.069660 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13e3d99-a2a7-4358-8c23-eebc46aa3ab1-kube-api-access-74xh7" (OuterVolumeSpecName: "kube-api-access-74xh7") pod "c13e3d99-a2a7-4358-8c23-eebc46aa3ab1" (UID: "c13e3d99-a2a7-4358-8c23-eebc46aa3ab1"). InnerVolumeSpecName "kube-api-access-74xh7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:47:46.168349 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:46.168249 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-74xh7\" (UniqueName: \"kubernetes.io/projected/c13e3d99-a2a7-4358-8c23-eebc46aa3ab1-kube-api-access-74xh7\") on node \"ip-10-0-134-64.ec2.internal\" DevicePath \"\""
Apr 17 11:47:46.854100 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:46.854075 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"
Apr 17 11:47:46.854249 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:46.854070 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6" event={"ID":"c13e3d99-a2a7-4358-8c23-eebc46aa3ab1","Type":"ContainerDied","Data":"b4ca35e081d38169315291e9e42f1bb9bb9128e4017db91b187b86ec0b0cd993"}
Apr 17 11:47:46.854249 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:47:46.854184 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4ca35e081d38169315291e9e42f1bb9bb9128e4017db91b187b86ec0b0cd993"
Apr 17 11:50:22.754045 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:50:22.754017 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:50:22.755334 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:50:22.755317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log"
Apr 17 11:52:53.031912 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:53.031834 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh"]
Apr 17 11:52:53.034432 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:53.034142 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-hfmts/progression-disabled-node-0-0-kgmdh"]
Apr 17 11:52:53.038900 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:53.038877 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf"]
Apr 17 11:52:53.042316 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:53.042296 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-hfmts/progression-enabled-node-0-0-qwhrf"]
Apr 17 11:52:53.046694 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:53.046672 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"]
Apr 17 11:52:53.050167 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:53.050149 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-hfmts/progression-invalid-node-0-0-z5sd6"]
Apr 17 11:52:54.797676 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:54.797647 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec037c8-de07-4100-93e9-15501330e152" path="/var/lib/kubelet/pods/6ec037c8-de07-4100-93e9-15501330e152/volumes"
Apr 17 11:52:54.798048 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:54.797957 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe9df75-bf9a-4cb2-af7f-1da834a42517" path="/var/lib/kubelet/pods/abe9df75-bf9a-4cb2-af7f-1da834a42517/volumes"
Apr 17 11:52:54.798240 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:52:54.798227 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13e3d99-a2a7-4358-8c23-eebc46aa3ab1" path="/var/lib/kubelet/pods/c13e3d99-a2a7-4358-8c23-eebc46aa3ab1/volumes"
Apr 17 11:53:04.227666 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:04.227637 2577 ???:1] "http2: server: error reading preface from client 10.0.134.64:32974: read tcp 10.0.134.64:10250->10.0.134.64:32974: read: connection reset by peer"
Apr 17 11:53:04.241614 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:04.241586 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-5995989c79-fm62f_d1ae9b7f-5a9c-472f-a8f2-5bdbae946599/manager/0.log"
Apr 17 11:53:04.671381 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:04.671305 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-5995989c79-fm62f_d1ae9b7f-5a9c-472f-a8f2-5bdbae946599/manager/0.log"
Apr 17 11:53:05.114694 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:05.114666 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-5995989c79-fm62f_d1ae9b7f-5a9c-472f-a8f2-5bdbae946599/manager/0.log"
Apr 17 11:53:22.763552 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:22.763512 2577 scope.go:117] "RemoveContainer" containerID="c41f32bf9044d9794ed2529a55aa5ee3aca53ae93d76c62270f295d0e4d7136a"
Apr 17 11:53:22.770971 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:22.770952 2577 scope.go:117] "RemoveContainer" containerID="9c58bf1c1e182c2de6274239ca21bafba50af1560e7535ec574794c312032a94"
Apr 17 11:53:22.777621 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:22.777604 2577 scope.go:117] "RemoveContainer" containerID="2530837c7e129ff61d1713e63b423173ba8173d885d4ca839a4188147fa2b82b"
Apr 17 11:53:41.272912 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.272877 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjn79/must-gather-tr86l"]
Apr 17 11:53:41.273393 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.273175 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c13e3d99-a2a7-4358-8c23-eebc46aa3ab1" containerName="node"
Apr 17 11:53:41.273393 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.273186 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13e3d99-a2a7-4358-8c23-eebc46aa3ab1" containerName="node"
Apr 17 11:53:41.273393 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.273242 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c13e3d99-a2a7-4358-8c23-eebc46aa3ab1" containerName="node"
Apr 17 11:53:41.277222 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.277205 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.279641 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.279619 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jjn79\"/\"openshift-service-ca.crt\""
Apr 17 11:53:41.280721 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.280698 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jjn79\"/\"kube-root-ca.crt\""
Apr 17 11:53:41.280889 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.280707 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jjn79\"/\"default-dockercfg-9rt4j\""
Apr 17 11:53:41.282382 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.282361 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjn79/must-gather-tr86l"]
Apr 17 11:53:41.398385 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.398348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/464b60e5-46f6-4068-9a09-255aeb29089a-must-gather-output\") pod \"must-gather-tr86l\" (UID: \"464b60e5-46f6-4068-9a09-255aeb29089a\") " pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.398549 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.398405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnxl5\" (UniqueName: \"kubernetes.io/projected/464b60e5-46f6-4068-9a09-255aeb29089a-kube-api-access-xnxl5\") pod \"must-gather-tr86l\" (UID: \"464b60e5-46f6-4068-9a09-255aeb29089a\") " pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.499028 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.498993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/464b60e5-46f6-4068-9a09-255aeb29089a-must-gather-output\") pod \"must-gather-tr86l\" (UID: \"464b60e5-46f6-4068-9a09-255aeb29089a\") " pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.499190 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.499054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnxl5\" (UniqueName: \"kubernetes.io/projected/464b60e5-46f6-4068-9a09-255aeb29089a-kube-api-access-xnxl5\") pod \"must-gather-tr86l\" (UID: \"464b60e5-46f6-4068-9a09-255aeb29089a\") " pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.499348 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.499329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/464b60e5-46f6-4068-9a09-255aeb29089a-must-gather-output\") pod \"must-gather-tr86l\" (UID: \"464b60e5-46f6-4068-9a09-255aeb29089a\") " pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.506209 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.506191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnxl5\" (UniqueName: \"kubernetes.io/projected/464b60e5-46f6-4068-9a09-255aeb29089a-kube-api-access-xnxl5\") pod \"must-gather-tr86l\" (UID: \"464b60e5-46f6-4068-9a09-255aeb29089a\") " pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.586532 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.586462 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjn79/must-gather-tr86l"
Apr 17 11:53:41.702614 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.702585 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjn79/must-gather-tr86l"]
Apr 17 11:53:41.705606 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:53:41.705576 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod464b60e5_46f6_4068_9a09_255aeb29089a.slice/crio-f8f5e30e11597092fb03e33be21315b885da34a05ffbf9d34e7226c29a3f05f2 WatchSource:0}: Error finding container f8f5e30e11597092fb03e33be21315b885da34a05ffbf9d34e7226c29a3f05f2: Status 404 returned error can't find the container with id f8f5e30e11597092fb03e33be21315b885da34a05ffbf9d34e7226c29a3f05f2
Apr 17 11:53:41.707398 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.707379 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:53:41.843334 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:41.843245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjn79/must-gather-tr86l" event={"ID":"464b60e5-46f6-4068-9a09-255aeb29089a","Type":"ContainerStarted","Data":"f8f5e30e11597092fb03e33be21315b885da34a05ffbf9d34e7226c29a3f05f2"}
Apr 17 11:53:42.848665 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:42.848621 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjn79/must-gather-tr86l" event={"ID":"464b60e5-46f6-4068-9a09-255aeb29089a","Type":"ContainerStarted","Data":"dfdaa4e4ceb840de9f4b567173bc8c67a92a4de9222af9a349c642288c51055f"}
Apr 17 11:53:43.853597 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:43.853553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjn79/must-gather-tr86l" event={"ID":"464b60e5-46f6-4068-9a09-255aeb29089a","Type":"ContainerStarted","Data":"df5c986592914cc4528a6daf6da4f0c537921690fb13ddfa3acd7d2f654470db"}
Apr 17 11:53:43.868952 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:43.868884 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjn79/must-gather-tr86l" podStartSLOduration=1.909307953 podStartE2EDuration="2.868865639s" podCreationTimestamp="2026-04-17 11:53:41 +0000 UTC" firstStartedPulling="2026-04-17 11:53:41.707530733 +0000 UTC m=+1399.505245331" lastFinishedPulling="2026-04-17 11:53:42.667088415 +0000 UTC m=+1400.464803017" observedRunningTime="2026-04-17 11:53:43.868265326 +0000 UTC m=+1401.665979956" watchObservedRunningTime="2026-04-17 11:53:43.868865639 +0000 UTC m=+1401.666580257"
Apr 17 11:53:44.011580 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:44.011554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-49lr8_e74f83b5-a2a2-4262-89a6-a122df9a5401/global-pull-secret-syncer/0.log"
Apr 17 11:53:44.129434 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:44.129361 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fxwv2_93c44504-2cac-4acc-82af-a24fa55d1c56/konnectivity-agent/0.log"
Apr 17 11:53:44.227117 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:44.227075 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-64.ec2.internal_060506f2615e2a76fac2c219a480cfb1/haproxy/0.log"
Apr 17 11:53:47.379728 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.378909 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-775d77b857-hppnr_2f2de312-3ca2-4d8f-ac12-6ed1decec071/metrics-server/0.log"
Apr 17 11:53:47.405475 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.405386 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-gdfmh_bedd9666-11f0-4ba3-999c-964028f81db4/monitoring-plugin/0.log"
Apr 17 11:53:47.512440 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.512368 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9wzj8_b5f0a58b-6092-4b24-b961-0e7f736de486/node-exporter/0.log"
Apr 17 11:53:47.537123 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.537092 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9wzj8_b5f0a58b-6092-4b24-b961-0e7f736de486/kube-rbac-proxy/0.log"
Apr 17 11:53:47.556575 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.556544 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9wzj8_b5f0a58b-6092-4b24-b961-0e7f736de486/init-textfile/0.log"
Apr 17 11:53:47.652618 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.652533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2xksx_329c88e2-dddc-446c-b2b2-4dff96a8eb08/kube-rbac-proxy-main/0.log"
Apr 17 11:53:47.671563 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.671532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2xksx_329c88e2-dddc-446c-b2b2-4dff96a8eb08/kube-rbac-proxy-self/0.log"
Apr 17 11:53:47.690757 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.690729 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2xksx_329c88e2-dddc-446c-b2b2-4dff96a8eb08/openshift-state-metrics/0.log"
Apr 17 11:53:47.724789 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.724756 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cb44461a-db39-4d2a-9070-8d8384d3602a/prometheus/0.log"
Apr 17 11:53:47.746217 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.746187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cb44461a-db39-4d2a-9070-8d8384d3602a/config-reloader/0.log"
Apr 17 11:53:47.764850 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.764823 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cb44461a-db39-4d2a-9070-8d8384d3602a/thanos-sidecar/0.log"
Apr 17 11:53:47.786089 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.786043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cb44461a-db39-4d2a-9070-8d8384d3602a/kube-rbac-proxy-web/0.log"
Apr 17 11:53:47.807752 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.807707 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cb44461a-db39-4d2a-9070-8d8384d3602a/kube-rbac-proxy/0.log"
Apr 17 11:53:47.826738 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.826707 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cb44461a-db39-4d2a-9070-8d8384d3602a/kube-rbac-proxy-thanos/0.log"
Apr 17 11:53:47.847385 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.847337 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cb44461a-db39-4d2a-9070-8d8384d3602a/init-config-reloader/0.log"
Apr 17 11:53:47.915411 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:47.915315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-5m4sp_f58c2499-4fc3-4b9f-88eb-b576bf8234f4/prometheus-operator-admission-webhook/0.log"
Apr 17 11:53:50.806176 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.806139 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"]
Apr 17 11:53:50.811004 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.810978 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:50.817207 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.817172 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"]
Apr 17 11:53:50.980252 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.980207 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-sys\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:50.980433 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.980310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87m6d\" (UniqueName: \"kubernetes.io/projected/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-kube-api-access-87m6d\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:50.980433 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.980356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-proc\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:50.980433 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.980381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-lib-modules\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:50.980433 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:50.980425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-podres\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081378 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-sys\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081378 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87m6d\" (UniqueName: \"kubernetes.io/projected/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-kube-api-access-87m6d\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081568 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-proc\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081568 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-lib-modules\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081568 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081437 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-sys\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081568 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-podres\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081712 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-lib-modules\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081712 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-proc\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.081712 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.081583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-podres\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.088683 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.088659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87m6d\" (UniqueName: \"kubernetes.io/projected/0b1cda27-0b1d-4c4e-aa85-3a224a7759a1-kube-api-access-87m6d\") pod \"perf-node-gather-daemonset-q9bj7\" (UID: \"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1\") " pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.102582 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.102562 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gsgsr_15ba3943-b4d6-43fe-88ff-590573a317b8/dns/0.log"
Apr 17 11:53:51.119677 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.119655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gsgsr_15ba3943-b4d6-43fe-88ff-590573a317b8/kube-rbac-proxy/0.log"
Apr 17 11:53:51.121534 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.121517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.181736 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.181613 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-stmhs_3f06ebee-cbe3-4266-bf01-0bb889437be7/dns-node-resolver/0.log"
Apr 17 11:53:51.255423 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.255365 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"]
Apr 17 11:53:51.259133 ip-10-0-134-64 kubenswrapper[2577]: W0417 11:53:51.259090 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b1cda27_0b1d_4c4e_aa85_3a224a7759a1.slice/crio-56350757c7881c8746b7f33520603789371d0bc09c5f710dc43422354200fe27 WatchSource:0}: Error finding container 56350757c7881c8746b7f33520603789371d0bc09c5f710dc43422354200fe27: Status 404 returned error can't find the container with id 56350757c7881c8746b7f33520603789371d0bc09c5f710dc43422354200fe27
Apr 17 11:53:51.566617 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.566585 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-frb9s_a1b0eda5-8b26-4ce3-af63-74364b0ea28f/node-ca/0.log"
Apr 17 11:53:51.882710 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.882635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7" event={"ID":"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1","Type":"ContainerStarted","Data":"c9d4673441810b7ca33ca249ffb4aed7e75183a6186bd31ba4440bcacde3508d"}
Apr 17 11:53:51.882710 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.882669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7" event={"ID":"0b1cda27-0b1d-4c4e-aa85-3a224a7759a1","Type":"ContainerStarted","Data":"56350757c7881c8746b7f33520603789371d0bc09c5f710dc43422354200fe27"}
Apr 17 11:53:51.883137 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.882769 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:51.897671 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:51.897619 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7" podStartSLOduration=1.8976061290000001 podStartE2EDuration="1.897606129s" podCreationTimestamp="2026-04-17 11:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:53:51.897332023 +0000 UTC m=+1409.695046642" watchObservedRunningTime="2026-04-17 11:53:51.897606129 +0000 UTC m=+1409.695320745"
Apr 17 11:53:52.529635 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:52.529609 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-c5z7w_8a43f043-3738-4fc8-9a0f-9a3de52038b5/serve-healthcheck-canary/0.log"
Apr 17 11:53:52.957404 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:52.957365 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dn4vn_d12374a5-1ff4-4f08-9980-138794998ec3/kube-rbac-proxy/0.log"
Apr 17 11:53:52.973914 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:52.973889 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dn4vn_d12374a5-1ff4-4f08-9980-138794998ec3/exporter/0.log"
Apr 17 11:53:52.994832 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:52.994804 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dn4vn_d12374a5-1ff4-4f08-9980-138794998ec3/extractor/0.log"
Apr 17 11:53:57.895717 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:57.895694 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jjn79/perf-node-gather-daemonset-q9bj7"
Apr 17 11:53:58.500350 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:58.500323 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl46q_bb2737eb-5571-4fee-8d9a-10110cc1a205/kube-multus-additional-cni-plugins/0.log"
Apr 17 11:53:58.518849 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:58.518825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl46q_bb2737eb-5571-4fee-8d9a-10110cc1a205/egress-router-binary-copy/0.log"
Apr 17 11:53:58.535892 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:58.535868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl46q_bb2737eb-5571-4fee-8d9a-10110cc1a205/cni-plugins/0.log"
Apr 17 11:53:58.552754 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:58.552735 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl46q_bb2737eb-5571-4fee-8d9a-10110cc1a205/bond-cni-plugin/0.log"
Apr 17 11:53:58.568924 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:58.568908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl46q_bb2737eb-5571-4fee-8d9a-10110cc1a205/routeoverride-cni/0.log"
Apr 17 11:53:58.588942 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:58.588920 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl46q_bb2737eb-5571-4fee-8d9a-10110cc1a205/whereabouts-cni-bincopy/0.log"
Apr 17 11:53:58.606440 ip-10-0-134-64 kubenswrapper[2577]: I0417
11:53:58.606422 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl46q_bb2737eb-5571-4fee-8d9a-10110cc1a205/whereabouts-cni/0.log" Apr 17 11:53:58.896902 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:58.896812 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cgvrz_517a579e-7efd-4d38-8225-2b0c7c48d532/kube-multus/0.log" Apr 17 11:53:59.009471 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:59.009411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xw9bz_4155f35e-1865-499f-88fb-fdde1e2c1218/network-metrics-daemon/0.log" Apr 17 11:53:59.026714 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:53:59.026685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xw9bz_4155f35e-1865-499f-88fb-fdde1e2c1218/kube-rbac-proxy/0.log" Apr 17 11:54:00.452726 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.452698 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-controller/0.log" Apr 17 11:54:00.468837 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.468805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/0.log" Apr 17 11:54:00.475913 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.475885 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovn-acl-logging/1.log" Apr 17 11:54:00.493809 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.493741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/kube-rbac-proxy-node/0.log" Apr 17 11:54:00.512508 ip-10-0-134-64 kubenswrapper[2577]: I0417 
11:54:00.512489 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:54:00.527700 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.527684 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/northd/0.log" Apr 17 11:54:00.545367 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.545349 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/nbdb/0.log" Apr 17 11:54:00.562607 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.562585 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/sbdb/0.log" Apr 17 11:54:00.657698 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:00.657669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-znj2s_43535899-eb5a-4030-8bab-db2650a0cbff/ovnkube-controller/0.log" Apr 17 11:54:01.629688 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:01.629654 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cgpzp_34d03c01-00bf-416b-8b46-2274587cc240/network-check-target-container/0.log" Apr 17 11:54:02.415338 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:02.415311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qkqfc_ecf1ff91-10cb-4ca8-8a83-7ed6b852b5ac/iptables-alerter/0.log" Apr 17 11:54:03.061560 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:03.061531 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-524kl_ff8d9fca-80b5-4d5a-99c0-374a747b0900/tuned/0.log" Apr 17 11:54:06.159198 ip-10-0-134-64 
kubenswrapper[2577]: I0417 11:54:06.159167 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-swb84_187342e6-1155-44f7-a799-bfeab7d58152/csi-driver/0.log" Apr 17 11:54:06.176920 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:06.176850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-swb84_187342e6-1155-44f7-a799-bfeab7d58152/csi-node-driver-registrar/0.log" Apr 17 11:54:06.193934 ip-10-0-134-64 kubenswrapper[2577]: I0417 11:54:06.193907 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-swb84_187342e6-1155-44f7-a799-bfeab7d58152/csi-liveness-probe/0.log"