Apr 23 16:32:35.654274 ip-10-0-136-190 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 16:32:35.654285 ip-10-0-136-190 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 16:32:35.654292 ip-10-0-136-190 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 16:32:35.654437 ip-10-0-136-190 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 16:32:45.814944 ip-10-0-136-190 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 16:32:45.814959 ip-10-0-136-190 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 80315098c19b4deaa6ee8f7a066d5255 --
Apr 23 16:35:13.994882 ip-10-0-136-190 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:14.483206 ip-10-0-136-190 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:14.483206 ip-10-0-136-190 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:14.483206 ip-10-0-136-190 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:14.483206 ip-10-0-136-190 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
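The `Failed with result 'resources'` lines above are systemd's way of saying a resource the unit declares could not be set up before launch; here the trigger is the preceding `Failed to load environment files: No such file or directory`. A minimal sketch for locating which `EnvironmentFile=` entry is missing (the unit path is an assumption and would normally be resolved via `systemctl show`; a leading `-` marks a file systemd treats as optional, so it is stripped before the existence check):

```shell
# Sketch: list the EnvironmentFile= paths a systemd unit declares and flag
# any that are missing on disk. A missing non-optional EnvironmentFile is
# one common cause of a 'resources' start failure like the one logged above.
check_env_files() {
  grep -h '^EnvironmentFile=' "$1" |
    sed -e 's/^EnvironmentFile=//' -e 's/^-//' |   # strip key and optional '-' marker
    while read -r f; do
      if [ -e "$f" ]; then echo "ok $f"; else echo "MISSING $f"; fi
    done
}

# e.g. on the affected node (hypothetical invocation):
# check_env_files "$(systemctl show -p FragmentPath --value kubelet.service)"
```

Any drop-in files under `kubelet.service.d/` would need the same check; `systemctl cat kubelet.service` shows the merged unit.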
Apr 23 16:35:14.483206 ip-10-0-136-190 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:14.484818 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.484705 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:35:14.490820 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490797 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.490820 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490815 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.490820 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490820 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.490820 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490824 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490829 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490833 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490837 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490841 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490845 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490849 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490852 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490856 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490859 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490862 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490866 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490870 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490874 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490878 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490882 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490885 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490889 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490899 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490903 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.491074 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490907 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490910 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490914 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490918 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490922 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490926 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490929 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490933 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490938 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490942 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490946 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490949 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490954 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490958 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490963 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490968 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490972 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490976 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490981 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.491905 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490985 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490989 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490993 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.490999 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491003 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491008 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491012 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491016 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491020 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491024 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491028 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491032 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491036 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491041 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491047 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491051 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491055 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491060 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491064 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491068 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.492749 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491072 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491076 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491080 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491085 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491089 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491095 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491099 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491104 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491112 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491117 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491122 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491126 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491131 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491137 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491143 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491148 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491154 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491158 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491162 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.493663 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491167 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491171 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491176 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491180 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491185 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491829 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491839 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491843 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491848 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491852 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491856 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491860 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491865 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491869 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491874 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491878 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491883 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491888 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491894 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.494196 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491901 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491908 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491913 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491918 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491922 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491926 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491931 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491935 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491939 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491944 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491949 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491953 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491957 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491961 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491965 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491970 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491974 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491980 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491985 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491989 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.494706 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491993 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.491997 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492001 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492006 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492010 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492014 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492018 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492022 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492027 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492031 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492035 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492040 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492044 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492048 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492052 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492057 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492061 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492066 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492070 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492074 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.495323 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492078 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492082 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492086 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492099 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492105 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492110 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492114 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492118 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492122 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492126 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492130 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492134 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492139 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492144 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492148 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492152 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492157 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492161 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492165 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492169 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.496011 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492173 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492177 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492181 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492186 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492190 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492195 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492202 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492206 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492210 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492215 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492219 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.492223 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492326 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492337 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492352 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492359 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492366 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492372 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492384 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492391 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492396 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:14.496498 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492401 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492407 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492412 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492417 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492422 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492426 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492431 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492436 2571 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492440 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492445 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492451 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492455 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492461 2571 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492465 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492471 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492477 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492483 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492488 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492493 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492498 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492503 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492507 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492512 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492517 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492524 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:14.497111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492529 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492533 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492538 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492544 2571 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492548 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 
23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492555 2571 flags.go:64] FLAG: --event-burst="100" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492561 2571 flags.go:64] FLAG: --event-qps="50" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492566 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492571 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492576 2571 flags.go:64] FLAG: --eviction-hard="" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492583 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492587 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492592 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492612 2571 flags.go:64] FLAG: --eviction-soft="" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492617 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492623 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492627 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492632 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492637 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492642 
2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492647 2571 flags.go:64] FLAG: --feature-gates="" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492653 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492658 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492663 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492669 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492674 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 23 16:35:14.497755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492680 2571 flags.go:64] FLAG: --help="false" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492684 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492689 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492694 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492699 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492704 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492710 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:35:14.498415 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:35:14.492715 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492720 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492725 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492730 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492735 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492740 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492745 2571 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492750 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492754 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492759 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492763 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492767 2571 flags.go:64] FLAG: --lock-file="" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492772 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492777 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492782 2571 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492791 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:35:14.498415 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492795 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492800 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492804 2571 flags.go:64] FLAG: --logging-format="text" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492809 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492814 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492818 2571 flags.go:64] FLAG: --manifest-url="" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492823 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492829 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492834 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492841 2571 flags.go:64] FLAG: --max-pods="110" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492846 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492851 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492856 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:35:14.499008 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:35:14.492860 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492865 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492870 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492875 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492887 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492892 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492897 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492903 2571 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492908 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492917 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492922 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:35:14.499008 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492927 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492932 2571 flags.go:64] FLAG: --port="10250" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492940 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 
16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492944 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0991bf638e47cd6ab" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492950 2571 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492955 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492960 2571 flags.go:64] FLAG: --register-node="true" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492965 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492970 2571 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492976 2571 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492980 2571 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492985 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492990 2571 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.492996 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493001 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493006 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493010 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493015 2571 flags.go:64] FLAG: --runonce="false" Apr 23 
16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493020 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493025 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493029 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493034 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493039 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493044 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493049 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493054 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 16:35:14.499691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493058 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493063 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493067 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493073 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493078 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493083 2571 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: 
I0423 16:35:14.493087 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493096 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493102 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493107 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493117 2571 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493122 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493127 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493131 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493136 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493141 2571 flags.go:64] FLAG: --v="2" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493148 2571 flags.go:64] FLAG: --version="false" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493154 2571 flags.go:64] FLAG: --vmodule="" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493161 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.493166 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493319 2571 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493326 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493331 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493336 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:14.500315 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493341 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493345 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493349 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493353 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493358 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493362 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493366 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493371 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493375 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493379 2571 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493383 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493387 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493392 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493397 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493401 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493405 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493411 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493415 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493421 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493426 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:14.500944 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493430 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493434 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:14.501439 ip-10-0-136-190 
kubenswrapper[2571]: W0423 16:35:14.493439 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493443 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493447 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493451 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493455 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493460 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493464 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493468 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493473 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493477 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493481 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493486 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493490 2571 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493521 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493528 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493533 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493540 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:35:14.501439 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493545 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493549 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493554 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493558 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493562 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493566 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493571 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493575 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 
16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493580 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493586 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493590 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493612 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493617 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493621 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493625 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493629 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493633 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493638 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493642 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493646 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:14.501935 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493650 2571 feature_gate.go:328] unrecognized 
feature gate: RouteAdvertisements Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493657 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493663 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493667 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493671 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493675 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493679 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493683 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493688 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493692 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493696 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493700 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493704 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 
23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493709 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493713 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493718 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493722 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493726 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493730 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:14.502442 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493735 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493739 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493744 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.493748 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.494614 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.500850 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.500866 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500912 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500919 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500923 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500926 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500930 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500932 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500935 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500938 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.502934 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500941 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500943 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500946 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500949 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500951 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500954 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500957 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500960 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500963 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500965 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500968 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500971 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500974 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500976 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500979 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500982 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500984 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500987 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500990 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500993 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.503304 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500995 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.500998 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501001 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501005 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501008 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501011 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501013 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501016 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501018 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501021 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501023 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501026 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501028 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501031 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501033 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501036 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501039 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501041 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501044 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501046 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.503847 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501049 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501051 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501054 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501056 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501059 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501061 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501063 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501068 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501070 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501073 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501075 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501078 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501080 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501083 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501086 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501088 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501092 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501094 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501097 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.504372 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501099 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501102 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501104 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501107 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501109 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501112 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501115 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501117 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501120 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501122 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501125 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501127 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501130 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501132 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501135 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501137 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501140 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501143 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.504844 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501147 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.501152 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501250 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501255 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501259 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501264 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501267 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501270 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501273 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501276 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501279 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501282 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501286 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501288 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501291 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:14.505293 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501294 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501297 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501300 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501302 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501305 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501308 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501311 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501314 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501316 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501319 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501322 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501324 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501327 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501329 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501332 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501335 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501337 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501339 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501342 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501345 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:14.505695 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501347 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501350 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501352 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501355 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501357 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501360 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501362 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501365 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501368 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501370 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501376 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501379 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501381 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501384 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501386 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501389 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501391 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501394 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501396 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:14.506183 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501399 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501401 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501403 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501406 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501408 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501411 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501413 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501417 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501420 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501423 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501426 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501428 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501432 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501434 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501437 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501440 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501442 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501445 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501448 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501451 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:14.506668 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501453 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501456 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501459 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501461 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501464 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501467 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501469 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501472 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501474 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501477 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501479 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501482 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501484 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:14.501487 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.501491 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:14.507161 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.502188 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 16:35:14.507532 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.504097 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 16:35:14.507532 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.504983 2571 server.go:1019] "Starting client certificate rotation"
Apr 23 16:35:14.507532 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.505078 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:14.507532 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.505467 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:14.533909 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.533892 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:14.536575 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.536556 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:14.553057 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.553040 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 23 16:35:14.558103 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.558088 2571 log.go:25] "Validated CRI v1 image API"
Apr 23 16:35:14.560665 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.560652 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 16:35:14.564804 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.564783 2571 fs.go:135] Filesystem UUIDs: map[53e04970-634b-4d68-99b6-2a90982495cc:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b7eac1bb-35fa-4811-87c7-c43d3c557a98:/dev/nvme0n1p3]
Apr 23 16:35:14.564884 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.564803 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 16:35:14.567918 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.567901 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:14.570240 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.570129 2571 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:14.568584517 +0000 UTC m=+0.446994114 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099209 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec271648b80ecd83a1f8a0c76fa956de SystemUUID:ec271648-b80e-cd83-a1f8-a0c76fa956de BootID:80315098-c19b-4dea-a6ee-8f7a066d5255 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b6:0d:6a:db:cd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b6:0d:6a:db:cd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:f2:e8:fe:00:15 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:35:14.570240 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.570238 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:35:14.570356 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.570331 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:35:14.571443 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.571418 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:35:14.571593 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.571446 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-190.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"L
essThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 16:35:14.571654 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.571611 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 16:35:14.571654 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.571620 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 16:35:14.571654 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.571633 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:14.573500 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.573489 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:35:14.576482 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.576471 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:35:14.576591 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.576582 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 16:35:14.579084 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.579074 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 23 16:35:14.579127 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.579089 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 16:35:14.579127 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.579102 2571 
file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 16:35:14.579127 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.579111 2571 kubelet.go:397] "Adding apiserver pod source" Apr 23 16:35:14.579127 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.579121 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 16:35:14.580026 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.580012 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:14.580066 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.580038 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:35:14.582815 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.582801 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 16:35:14.584330 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.584317 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 16:35:14.587565 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587551 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 16:35:14.587626 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587576 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 16:35:14.587626 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587585 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 16:35:14.587626 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587590 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 16:35:14.587626 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587607 2571 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Apr 23 16:35:14.587626 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587620 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 16:35:14.587626 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587626 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 16:35:14.587785 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587632 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 16:35:14.587785 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587639 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 16:35:14.587785 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587644 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 16:35:14.587785 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587657 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 16:35:14.587785 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.587665 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 16:35:14.588526 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.588513 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 16:35:14.588559 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.588528 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 16:35:14.591964 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.591951 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 16:35:14.592026 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.591985 2571 server.go:1295] "Started kubelet" Apr 23 16:35:14.592095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.592063 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 16:35:14.592370 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.592262 2571 ratelimit.go:55] 
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 16:35:14.592486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.592473 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 16:35:14.592860 ip-10-0-136-190 systemd[1]: Started Kubernetes Kubelet. Apr 23 16:35:14.593454 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.593431 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 16:35:14.593562 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.593449 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-190.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 16:35:14.593562 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.593557 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-190.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 16:35:14.594114 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.594086 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 16:35:14.596362 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.596341 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 23 16:35:14.599364 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.599343 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:14.599737 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.599717 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 16:35:14.600734 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.600707 2571 factory.go:153] Registering CRI-O factory Apr 23 16:35:14.600822 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.600765 2571 factory.go:223] Registration of the crio container factory successfully Apr 23 16:35:14.600822 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.600816 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 16:35:14.600921 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.600826 2571 factory.go:55] Registering systemd factory Apr 23 16:35:14.600921 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.600877 2571 factory.go:223] Registration of the systemd container factory successfully Apr 23 16:35:14.600921 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.600896 2571 factory.go:103] Registering Raw factory Apr 23 16:35:14.600921 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.600909 2571 manager.go:1196] Started watching for new ooms in manager Apr 23 16:35:14.601663 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.601293 2571 manager.go:319] Starting recovery of all containers Apr 23 16:35:14.601858 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.601832 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:14.601995 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.601969 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 16:35:14.601995 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.601998 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 16:35:14.602151 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:35:14.602113 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 16:35:14.602208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.602181 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 23 16:35:14.602208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.602188 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 23 16:35:14.605097 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.605067 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 16:35:14.605188 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.605133 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-190.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 16:35:14.606327 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.605242 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-190.ec2.internal.18a909a0e394b72d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-190.ec2.internal,UID:ip-10-0-136-190.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-190.ec2.internal,},FirstTimestamp:2026-04-23 16:35:14.591962925 +0000 UTC m=+0.470372521,LastTimestamp:2026-04-23 16:35:14.591962925 
+0000 UTC m=+0.470372521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-190.ec2.internal,}" Apr 23 16:35:14.609995 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.609978 2571 manager.go:324] Recovery completed Apr 23 16:35:14.614770 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.614757 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.616843 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.616827 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vdtxd" Apr 23 16:35:14.617296 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.617281 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.617385 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.617311 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.617385 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.617326 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.617765 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.617752 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 16:35:14.617765 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.617764 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 16:35:14.617900 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.617781 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:35:14.619838 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.619823 2571 policy_none.go:49] "None policy: Start" Apr 23 16:35:14.619919 ip-10-0-136-190 kubenswrapper[2571]: 
I0423 16:35:14.619843 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 16:35:14.619919 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.619856 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 23 16:35:14.623723 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.623649 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-190.ec2.internal.18a909a0e51741e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-190.ec2.internal,UID:ip-10-0-136-190.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-190.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-190.ec2.internal,},FirstTimestamp:2026-04-23 16:35:14.617295328 +0000 UTC m=+0.495704927,LastTimestamp:2026-04-23 16:35:14.617295328 +0000 UTC m=+0.495704927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-190.ec2.internal,}" Apr 23 16:35:14.633329 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.633272 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-190.ec2.internal.18a909a0e5179a53 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-190.ec2.internal,UID:ip-10-0-136-190.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-190.ec2.internal status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-190.ec2.internal,},FirstTimestamp:2026-04-23 16:35:14.617317971 +0000 UTC m=+0.495727568,LastTimestamp:2026-04-23 16:35:14.617317971 +0000 UTC m=+0.495727568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-190.ec2.internal,}" Apr 23 16:35:14.635834 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.635819 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vdtxd" Apr 23 16:35:14.656220 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.656204 2571 manager.go:341] "Starting Device Plugin manager" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.656236 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.656251 2571 server.go:85] "Starting device plugin registration server" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.656499 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.656513 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.656585 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.656676 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.656685 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: 
E0423 16:35:14.657180 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 16:35:14.672656 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.657209 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:14.737868 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.737831 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 16:35:14.739049 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.739031 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 16:35:14.739152 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.739056 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 16:35:14.739152 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.739074 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 16:35:14.739152 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.739081 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 16:35:14.739298 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.739161 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 16:35:14.741985 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.741964 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:14.756811 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.756773 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.757549 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.757533 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.757658 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.757559 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.757658 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.757570 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.757658 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.757592 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.767339 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.767319 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.767427 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.767342 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-190.ec2.internal\": node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 
16:35:14.785894 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.785874 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:14.839870 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.839842 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal"] Apr 23 16:35:14.839954 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.839913 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.842066 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.842049 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.842146 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.842079 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.842146 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.842088 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.843148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843136 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.843262 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843251 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.843311 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843278 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.843888 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843870 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.843888 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843880 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.844020 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843901 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.844020 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843907 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.844020 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843913 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.844020 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.843921 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.844828 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.844813 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.844879 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.844840 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:14.845425 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.845408 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:14.845489 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.845436 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:14.845489 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.845446 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:14.859128 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.859103 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-190.ec2.internal\" not found" node="ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.863510 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.863492 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-190.ec2.internal\" not found" node="ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.886288 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.886261 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:14.905118 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.905099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2031b0c27b4132e2a5b8904a3843f99d-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal\" (UID: \"2031b0c27b4132e2a5b8904a3843f99d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.905213 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.905134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2031b0c27b4132e2a5b8904a3843f99d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal\" (UID: \"2031b0c27b4132e2a5b8904a3843f99d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.905213 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:14.905161 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f4da7ff0dd296ec0b2891ebc8475588c-config\") pod \"kube-apiserver-proxy-ip-10-0-136-190.ec2.internal\" (UID: \"f4da7ff0dd296ec0b2891ebc8475588c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" Apr 23 16:35:14.986366 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:14.986325 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.005396 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.005306 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2031b0c27b4132e2a5b8904a3843f99d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal\" (UID: \"2031b0c27b4132e2a5b8904a3843f99d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.005396 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.005385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/2031b0c27b4132e2a5b8904a3843f99d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal\" (UID: \"2031b0c27b4132e2a5b8904a3843f99d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.005547 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.005422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2031b0c27b4132e2a5b8904a3843f99d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal\" (UID: \"2031b0c27b4132e2a5b8904a3843f99d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.005547 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.005441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f4da7ff0dd296ec0b2891ebc8475588c-config\") pod \"kube-apiserver-proxy-ip-10-0-136-190.ec2.internal\" (UID: \"f4da7ff0dd296ec0b2891ebc8475588c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.005547 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.005478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f4da7ff0dd296ec0b2891ebc8475588c-config\") pod \"kube-apiserver-proxy-ip-10-0-136-190.ec2.internal\" (UID: \"f4da7ff0dd296ec0b2891ebc8475588c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.005547 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.005528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2031b0c27b4132e2a5b8904a3843f99d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal\" (UID: \"2031b0c27b4132e2a5b8904a3843f99d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.086478 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.086403 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.161030 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.161001 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.165593 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.165572 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" Apr 23 16:35:15.187277 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.187245 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.287883 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.287845 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.388550 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.388485 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.489233 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.489178 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.504622 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.504590 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 16:35:15.504742 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.504727 2571 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 16:35:15.590433 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.590296 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.599959 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.599936 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:15.632303 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.632275 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:35:15.637562 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.637531 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:14 +0000 UTC" deadline="2027-09-25 11:52:55.616790886 +0000 UTC" Apr 23 16:35:15.637562 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.637560 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12475h17m39.979234195s" Apr 23 16:35:15.647734 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:15.647657 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2031b0c27b4132e2a5b8904a3843f99d.slice/crio-8887880f40ce3cd03290f8ef381c1afc00719c35c884f3ceb61f544dc2d57044 WatchSource:0}: Error finding container 8887880f40ce3cd03290f8ef381c1afc00719c35c884f3ceb61f544dc2d57044: Status 404 returned error can't find the container with id 8887880f40ce3cd03290f8ef381c1afc00719c35c884f3ceb61f544dc2d57044 Apr 23 
16:35:15.648087 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:15.648071 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4da7ff0dd296ec0b2891ebc8475588c.slice/crio-71f0e8ae6de0dec05f2fc48557ec6991f4181eb8786a0d6201c6df345e7e5f30 WatchSource:0}: Error finding container 71f0e8ae6de0dec05f2fc48557ec6991f4181eb8786a0d6201c6df345e7e5f30: Status 404 returned error can't find the container with id 71f0e8ae6de0dec05f2fc48557ec6991f4181eb8786a0d6201c6df345e7e5f30 Apr 23 16:35:15.652029 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.652016 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:35:15.665878 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.665860 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9hxlg" Apr 23 16:35:15.686829 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.686808 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9hxlg" Apr 23 16:35:15.691455 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.691440 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.741733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.741690 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" event={"ID":"f4da7ff0dd296ec0b2891ebc8475588c","Type":"ContainerStarted","Data":"71f0e8ae6de0dec05f2fc48557ec6991f4181eb8786a0d6201c6df345e7e5f30"} Apr 23 16:35:15.742467 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.742450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" 
event={"ID":"2031b0c27b4132e2a5b8904a3843f99d","Type":"ContainerStarted","Data":"8887880f40ce3cd03290f8ef381c1afc00719c35c884f3ceb61f544dc2d57044"} Apr 23 16:35:15.791657 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.791629 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.818914 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:15.818888 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:15.892231 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.892205 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:15.992897 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:15.992815 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-190.ec2.internal\" not found" Apr 23 16:35:16.069634 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.069424 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:16.077959 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.077896 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:16.100942 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.100916 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" Apr 23 16:35:16.113716 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.113689 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:35:16.114668 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.114648 2571 kubelet.go:3340] "Creating a mirror pod 
for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" Apr 23 16:35:16.124515 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.124494 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:35:16.580639 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.580592 2571 apiserver.go:52] "Watching apiserver" Apr 23 16:35:16.588435 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.588408 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 16:35:16.589544 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.589522 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6z9f6","openshift-multus/network-metrics-daemon-7kt9x","openshift-ovn-kubernetes/ovnkube-node-q6qml","kube-system/konnectivity-agent-97r2m","kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal","openshift-multus/multus-9bf7w","openshift-network-diagnostics/network-check-target-nvscm","openshift-network-operator/iptables-alerter-p6qdp","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h","openshift-cluster-node-tuning-operator/tuned-8dckj","openshift-dns/node-resolver-rpmks","openshift-image-registry/node-ca-zrzxd"] Apr 23 16:35:16.590863 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.590842 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.591783 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.591755 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:16.591866 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.591827 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99" Apr 23 16:35:16.593072 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.593049 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.593510 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.593491 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 16:35:16.593621 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.593530 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.593621 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.593505 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.593851 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.593838 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mz2r4\"" Apr 23 16:35:16.593955 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.593866 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 16:35:16.594135 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.594118 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-97r2m" Apr 23 16:35:16.595264 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.595215 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:16.595360 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.595294 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1" Apr 23 16:35:16.596861 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.596337 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.596861 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.596731 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.596861 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.596783 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:35:16.598143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.597488 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 16:35:16.598143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.597507 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.598143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.597720 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.598143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.597916 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:35:16.598143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.597939 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t6kth\"" Apr 23 16:35:16.598143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.598033 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7vv6n\"" Apr 23 16:35:16.598143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.598065 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:35:16.599117 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.599101 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:35:16.599117 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.599110 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:35:16.599936 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.599914 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.600050 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.600035 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.600761 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.600743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.601057 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.601043 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.601265 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.601225 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.601347 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.601328 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 16:35:16.601668 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.601645 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9kgt9\"" Apr 23 16:35:16.602518 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.602425 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zwkbr\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.602863 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.603035 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.603119 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zrzxd" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.603158 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.603445 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.603593 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.604407 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bjgcj\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.604571 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zqfld\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.604733 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.604784 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l87hb\"" Apr 23 16:35:16.605051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.604888 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.605499 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.605063 2571 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:35:16.605499 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.605132 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.606489 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.606467 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:35:16.607461 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.607440 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2tgsr\"" Apr 23 16:35:16.607548 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.607440 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:35:16.608462 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.608444 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 16:35:16.614417 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/adecd1bd-5d92-46f9-97d2-d824dfa55524-iptables-alerter-script\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.614475 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-systemd\") pod \"ovnkube-node-q6qml\" (UID: 
\"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.614475 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614441 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovnkube-config\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.614475 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysctl-conf\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.614620 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-cnibin\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.614620 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614497 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-daemon-config\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.614620 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614574 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-etc-kubernetes\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.614772 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614644 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.614772 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614701 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-ovn\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.614772 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxlnn\" (UniqueName: \"kubernetes.io/projected/243c2f8c-379f-4f30-8621-de9f2d7fc656-kube-api-access-xxlnn\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.614772 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614753 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-var-lib-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.614941 ip-10-0-136-190 kubenswrapper[2571]: I0423 
16:35:16.614777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-kubernetes\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.614941 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-cni-bin\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.614941 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-conf-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.614941 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-slash\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.614941 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614881 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-log-socket\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.614941 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614922 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cea1706c-bca2-4388-9143-d1aa2fbcdc62-serviceca\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-cnibin\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614967 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614983 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrjk\" (UniqueName: \"kubernetes.io/projected/e9cf07b9-000a-4b3d-b47a-38c57899b41d-kube-api-access-7qrjk\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.614996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysconfig\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615010 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adecd1bd-5d92-46f9-97d2-d824dfa55524-host-slash\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615029 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls76g\" (UniqueName: \"kubernetes.io/projected/204aff10-0787-493a-aee6-18d5bf3d74be-kube-api-access-ls76g\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee000696-33d6-4a0e-a2ac-f710ef5e1515-hosts-file\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.615149 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615102 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-hostroot\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.615149 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-etc-selinux\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j7bd\" (UniqueName: \"kubernetes.io/projected/cea1706c-bca2-4388-9143-d1aa2fbcdc62-kube-api-access-5j7bd\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615230 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d8172c3c-2af9-4787-983c-905f0f36a0b7-konnectivity-ca\") pod \"konnectivity-agent-97r2m\" 
(UID: \"d8172c3c-2af9-4787-983c-905f0f36a0b7\") " pod="kube-system/konnectivity-agent-97r2m" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-cni-netd\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4t5d\" (UniqueName: \"kubernetes.io/projected/adecd1bd-5d92-46f9-97d2-d824dfa55524-kube-api-access-c4t5d\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-systemd-units\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-socket-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615354 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r5w\" (UniqueName: \"kubernetes.io/projected/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-kube-api-access-m9r5w\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-var-lib-kubelet\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9cf07b9-000a-4b3d-b47a-38c57899b41d-cni-binary-copy\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.615486 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj4vx\" (UniqueName: \"kubernetes.io/projected/9af32d6e-ad97-4dd0-a962-8bc05ab87464-kube-api-access-jj4vx\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615517 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x74gk\" (UniqueName: \"kubernetes.io/projected/ee000696-33d6-4a0e-a2ac-f710ef5e1515-kube-api-access-x74gk\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615560 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-k8s-cni-cncf-io\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-netns\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-multus-certs\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgd2l\" (UniqueName: \"kubernetes.io/projected/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-kube-api-access-sgd2l\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615692 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-kubelet\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/243c2f8c-379f-4f30-8621-de9f2d7fc656-tmp\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.615988 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:35:16.615784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-run\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-lib-modules\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-tuned\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615854 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-os-release\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-cni-bin\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cea1706c-bca2-4388-9143-d1aa2fbcdc62-host\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd" Apr 23 16:35:16.615988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615967 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-kubelet\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.615990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616030 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-system-cni-dir\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-env-overrides\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616077 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-registration-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616102 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-sys-fs\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616144 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-system-cni-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.616802 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:35:16.616164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-socket-dir-parent\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d8172c3c-2af9-4787-983c-905f0f36a0b7-agent-certs\") pod \"konnectivity-agent-97r2m\" (UID: \"d8172c3c-2af9-4787-983c-905f0f36a0b7\") " pod="kube-system/konnectivity-agent-97r2m" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616221 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-systemd\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616241 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-run-netns\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616261 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovn-node-metrics-cert\") pod \"ovnkube-node-q6qml\" (UID: 
\"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-kubelet-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616310 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-etc-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-node-log\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616364 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovnkube-script-lib\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.616802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-device-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616423 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-modprobe-d\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysctl-d\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-host\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-cni-multus\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616523 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-os-release\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee000696-33d6-4a0e-a2ac-f710ef5e1515-tmp-dir\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-sys\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.617492 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.616590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-cni-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.687674 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.687638 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:15 +0000 UTC" deadline="2027-12-24 17:15:50.18405735 +0000 UTC" Apr 23 16:35:16.687674 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.687668 2571 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14640h40m33.496391547s" Apr 23 16:35:16.703308 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.703282 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 16:35:16.716810 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.716782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/adecd1bd-5d92-46f9-97d2-d824dfa55524-iptables-alerter-script\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.716957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.716817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-systemd\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.716957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.716838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovnkube-config\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.716957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.716877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysctl-conf\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.716957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.716926 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-systemd\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.717153 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.716974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-cnibin\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.717153 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-daemon-config\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.717153 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-etc-kubernetes\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.717153 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.717153 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717022 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysctl-conf\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.717153 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-cnibin\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.717153 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-etc-kubernetes\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.717562 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/adecd1bd-5d92-46f9-97d2-d824dfa55524-iptables-alerter-script\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.717562 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717517 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-daemon-config\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.717691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717640 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-ovn\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.717691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxlnn\" (UniqueName: \"kubernetes.io/projected/243c2f8c-379f-4f30-8621-de9f2d7fc656-kube-api-access-xxlnn\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.717821 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717692 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.717821 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-var-lib-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.717821 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717748 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-var-lib-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.717821 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717755 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-kubernetes\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.717821 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717783 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-cni-bin\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.717821 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-conf-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-slash\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-log-socket\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cea1706c-bca2-4388-9143-d1aa2fbcdc62-serviceca\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-cnibin\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717919 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovnkube-config\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717932 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrjk\" (UniqueName: \"kubernetes.io/projected/e9cf07b9-000a-4b3d-b47a-38c57899b41d-kube-api-access-7qrjk\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysconfig\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717990 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-slash\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adecd1bd-5d92-46f9-97d2-d824dfa55524-host-slash\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718043 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-kubernetes\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ls76g\" (UniqueName: \"kubernetes.io/projected/204aff10-0787-493a-aee6-18d5bf3d74be-kube-api-access-ls76g\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-cni-bin\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee000696-33d6-4a0e-a2ac-f710ef5e1515-hosts-file\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-hostroot\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee000696-33d6-4a0e-a2ac-f710ef5e1515-hosts-file\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.717785 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-ovn\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-etc-selinux\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718237 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-conf-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j7bd\" (UniqueName: \"kubernetes.io/projected/cea1706c-bca2-4388-9143-d1aa2fbcdc62-kube-api-access-5j7bd\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.718269 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d8172c3c-2af9-4787-983c-905f0f36a0b7-konnectivity-ca\") pod \"konnectivity-agent-97r2m\" (UID: \"d8172c3c-2af9-4787-983c-905f0f36a0b7\") " pod="kube-system/konnectivity-agent-97r2m"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-log-socket\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-cni-netd\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-cni-netd\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.718371 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.218327461 +0000 UTC m=+3.096737046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4t5d\" (UniqueName: \"kubernetes.io/projected/adecd1bd-5d92-46f9-97d2-d824dfa55524-kube-api-access-c4t5d\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-etc-selinux\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.718957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysconfig\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-systemd-units\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-socket-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9r5w\" (UniqueName: \"kubernetes.io/projected/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-kube-api-access-m9r5w\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-var-lib-kubelet\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718547 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9cf07b9-000a-4b3d-b47a-38c57899b41d-cni-binary-copy\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jj4vx\" (UniqueName: \"kubernetes.io/projected/9af32d6e-ad97-4dd0-a962-8bc05ab87464-kube-api-access-jj4vx\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x74gk\" (UniqueName: \"kubernetes.io/projected/ee000696-33d6-4a0e-a2ac-f710ef5e1515-kube-api-access-x74gk\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-k8s-cni-cncf-io\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-netns\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718736 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-multus-certs\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718762 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgd2l\" (UniqueName: \"kubernetes.io/projected/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-kube-api-access-sgd2l\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718789 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-kubelet\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/243c2f8c-379f-4f30-8621-de9f2d7fc656-tmp\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.719733 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adecd1bd-5d92-46f9-97d2-d824dfa55524-host-slash\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-run\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-lib-modules\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-cnibin\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.718946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d8172c3c-2af9-4787-983c-905f0f36a0b7-konnectivity-ca\") pod \"konnectivity-agent-97r2m\" (UID: \"d8172c3c-2af9-4787-983c-905f0f36a0b7\") " pod="kube-system/konnectivity-agent-97r2m"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.719789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-var-lib-kubelet\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.719837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cea1706c-bca2-4388-9143-d1aa2fbcdc62-serviceca\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.719852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-tuned\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.719893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-socket-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.719911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-kubelet\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.720428 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.720052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-k8s-cni-cncf-io\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.720891 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.720731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-netns\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.720891 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.720791 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-systemd-units\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.720980 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.720937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-run\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.721022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.720988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-os-release\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.721022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721031 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.721208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721064 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-cni-bin\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.721208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721054 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 16:35:16.721208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cea1706c-bca2-4388-9143-d1aa2fbcdc62-host\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd"
Apr 23 16:35:16.721208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721131 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-kubelet\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.721208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721183 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-run-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.721208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-cni-bin\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-kubelet\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cea1706c-bca2-4388-9143-d1aa2fbcdc62-host\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-system-cni-dir\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721293 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-run-multus-certs\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-env-overrides\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-registration-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-hostroot\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-sys-fs\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-system-cni-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-socket-dir-parent\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-lib-modules\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.721497 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d8172c3c-2af9-4787-983c-905f0f36a0b7-agent-certs\") pod \"konnectivity-agent-97r2m\" (UID: \"d8172c3c-2af9-4787-983c-905f0f36a0b7\") " pod="kube-system/konnectivity-agent-97r2m"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-systemd\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721542 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-run-netns\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovn-node-metrics-cert\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721612 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-kubelet-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-etc-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-node-log\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovnkube-script-lib\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-system-cni-dir\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-registration-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-sys-fs\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-system-cni-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-socket-dir-parent\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-device-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.721996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-modprobe-d\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysctl-d\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-host\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj"
Apr 23 16:35:16.722133 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722088 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-cni-multus\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w"
Apr 23 16:35:16.722958 ip-10-0-136-190
kubenswrapper[2571]: I0423 16:35:16.722118 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-os-release\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee000696-33d6-4a0e-a2ac-f710ef5e1515-tmp-dir\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e9cf07b9-000a-4b3d-b47a-38c57899b41d-cni-binary-copy\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-sys\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722211 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-cni-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722231 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-device-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722377 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-modprobe-d\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-sysctl-d\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-host\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722590 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-host-var-lib-cni-multus\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722674 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-os-release\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722719 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-etc-openvswitch\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.722958 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722935 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee000696-33d6-4a0e-a2ac-f710ef5e1515-tmp-dir\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.723534 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.722995 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-systemd\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.723534 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.723028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/243c2f8c-379f-4f30-8621-de9f2d7fc656-sys\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.723534 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.723061 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-run-netns\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.723534 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.723094 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-multus-cni-dir\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.723534 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.723496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-node-log\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.723792 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.723555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9af32d6e-ad97-4dd0-a962-8bc05ab87464-host-run-ovn-kubernetes\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.723792 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.723591 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovnkube-script-lib\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.723939 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.723845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-kubelet-dir\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: \"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.724504 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.724047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/204aff10-0787-493a-aee6-18d5bf3d74be-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.724504 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.724136 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e9cf07b9-000a-4b3d-b47a-38c57899b41d-os-release\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.724941 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.724569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/204aff10-0787-493a-aee6-18d5bf3d74be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.724941 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.724714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9af32d6e-ad97-4dd0-a962-8bc05ab87464-env-overrides\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.725578 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.725548 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/243c2f8c-379f-4f30-8621-de9f2d7fc656-tmp\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.726770 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.726714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d8172c3c-2af9-4787-983c-905f0f36a0b7-agent-certs\") pod \"konnectivity-agent-97r2m\" (UID: \"d8172c3c-2af9-4787-983c-905f0f36a0b7\") " pod="kube-system/konnectivity-agent-97r2m" Apr 23 16:35:16.727829 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.727711 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/243c2f8c-379f-4f30-8621-de9f2d7fc656-etc-tuned\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.727829 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.727807 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9af32d6e-ad97-4dd0-a962-8bc05ab87464-ovn-node-metrics-cert\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.728097 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.728075 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrjk\" (UniqueName: \"kubernetes.io/projected/e9cf07b9-000a-4b3d-b47a-38c57899b41d-kube-api-access-7qrjk\") pod \"multus-9bf7w\" (UID: \"e9cf07b9-000a-4b3d-b47a-38c57899b41d\") " pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.728251 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.728232 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:16.728314 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.728262 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:16.728314 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.728275 2571 projected.go:194] Error preparing data for projected volume kube-api-access-q2bbv for pod openshift-network-diagnostics/network-check-target-nvscm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:16.728426 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:16.728377 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv podName:1e04a51c-81ee-4ef6-b546-0b1e611642d1 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.228360029 +0000 UTC m=+3.106769621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q2bbv" (UniqueName: "kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv") pod "network-check-target-nvscm" (UID: "1e04a51c-81ee-4ef6-b546-0b1e611642d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:16.729656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.729624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j7bd\" (UniqueName: \"kubernetes.io/projected/cea1706c-bca2-4388-9143-d1aa2fbcdc62-kube-api-access-5j7bd\") pod \"node-ca-zrzxd\" (UID: \"cea1706c-bca2-4388-9143-d1aa2fbcdc62\") " pod="openshift-image-registry/node-ca-zrzxd" Apr 23 16:35:16.729897 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.729866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxlnn\" (UniqueName: \"kubernetes.io/projected/243c2f8c-379f-4f30-8621-de9f2d7fc656-kube-api-access-xxlnn\") pod \"tuned-8dckj\" (UID: \"243c2f8c-379f-4f30-8621-de9f2d7fc656\") " pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.730694 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.730649 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls76g\" (UniqueName: \"kubernetes.io/projected/204aff10-0787-493a-aee6-18d5bf3d74be-kube-api-access-ls76g\") pod \"multus-additional-cni-plugins-6z9f6\" (UID: \"204aff10-0787-493a-aee6-18d5bf3d74be\") " pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.731743 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.731720 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9r5w\" (UniqueName: \"kubernetes.io/projected/ffff79cd-77b6-4fa7-a7f2-a92f6efbb577-kube-api-access-m9r5w\") pod \"aws-ebs-csi-driver-node-md68h\" (UID: 
\"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.731990 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.731956 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgd2l\" (UniqueName: \"kubernetes.io/projected/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-kube-api-access-sgd2l\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:16.732891 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.732872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj4vx\" (UniqueName: \"kubernetes.io/projected/9af32d6e-ad97-4dd0-a962-8bc05ab87464-kube-api-access-jj4vx\") pod \"ovnkube-node-q6qml\" (UID: \"9af32d6e-ad97-4dd0-a962-8bc05ab87464\") " pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.733337 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.733314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4t5d\" (UniqueName: \"kubernetes.io/projected/adecd1bd-5d92-46f9-97d2-d824dfa55524-kube-api-access-c4t5d\") pod \"iptables-alerter-p6qdp\" (UID: \"adecd1bd-5d92-46f9-97d2-d824dfa55524\") " pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.733760 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.733741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x74gk\" (UniqueName: \"kubernetes.io/projected/ee000696-33d6-4a0e-a2ac-f710ef5e1515-kube-api-access-x74gk\") pod \"node-resolver-rpmks\" (UID: \"ee000696-33d6-4a0e-a2ac-f710ef5e1515\") " pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.817900 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.817871 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:16.902559 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:35:16.902472 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9bf7w" Apr 23 16:35:16.911405 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.911373 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" Apr 23 16:35:16.918419 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.918387 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-97r2m" Apr 23 16:35:16.924042 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.924015 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p6qdp" Apr 23 16:35:16.930626 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.930591 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rpmks" Apr 23 16:35:16.936159 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.936141 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" Apr 23 16:35:16.942689 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.942666 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" Apr 23 16:35:16.949266 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.949245 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8dckj" Apr 23 16:35:16.954818 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:16.954800 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zrzxd" Apr 23 16:35:17.152102 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.152071 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9cf07b9_000a_4b3d_b47a_38c57899b41d.slice/crio-2d8b67e30e99a21455b3ae8bd5a1764343677d490c0a08ad1905963263ac3c9a WatchSource:0}: Error finding container 2d8b67e30e99a21455b3ae8bd5a1764343677d490c0a08ad1905963263ac3c9a: Status 404 returned error can't find the container with id 2d8b67e30e99a21455b3ae8bd5a1764343677d490c0a08ad1905963263ac3c9a Apr 23 16:35:17.153213 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.153189 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee000696_33d6_4a0e_a2ac_f710ef5e1515.slice/crio-96d5a9da4c70168e55c511d4e2baedb8a8cdae5eac95d67bf6b0ff2c14284c09 WatchSource:0}: Error finding container 96d5a9da4c70168e55c511d4e2baedb8a8cdae5eac95d67bf6b0ff2c14284c09: Status 404 returned error can't find the container with id 96d5a9da4c70168e55c511d4e2baedb8a8cdae5eac95d67bf6b0ff2c14284c09 Apr 23 16:35:17.155096 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.154904 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcea1706c_bca2_4388_9143_d1aa2fbcdc62.slice/crio-0b6fc359505d508010a3d595e4fae742263775f8f79cb08ef6d5fa0bf7ecef78 WatchSource:0}: Error finding container 0b6fc359505d508010a3d595e4fae742263775f8f79cb08ef6d5fa0bf7ecef78: Status 404 returned error can't find the container with id 0b6fc359505d508010a3d595e4fae742263775f8f79cb08ef6d5fa0bf7ecef78 Apr 23 16:35:17.159004 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.158978 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8172c3c_2af9_4787_983c_905f0f36a0b7.slice/crio-4b4f5405e38fe53a8d3fd717f89175e5f387d7e76e9c982df032880f63b6cafa WatchSource:0}: Error finding container 4b4f5405e38fe53a8d3fd717f89175e5f387d7e76e9c982df032880f63b6cafa: Status 404 returned error can't find the container with id 4b4f5405e38fe53a8d3fd717f89175e5f387d7e76e9c982df032880f63b6cafa Apr 23 16:35:17.160076 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.160053 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod204aff10_0787_493a_aee6_18d5bf3d74be.slice/crio-05a3704ba1c163756a45270c6cd12bb1dac35468011896a2b6fa4b48e72c8b0e WatchSource:0}: Error finding container 05a3704ba1c163756a45270c6cd12bb1dac35468011896a2b6fa4b48e72c8b0e: Status 404 returned error can't find the container with id 05a3704ba1c163756a45270c6cd12bb1dac35468011896a2b6fa4b48e72c8b0e Apr 23 16:35:17.160546 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.160523 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffff79cd_77b6_4fa7_a7f2_a92f6efbb577.slice/crio-f79a8b288a717d1125b19313705690117bd43c926394b105ad6bec429a8dca10 WatchSource:0}: Error finding container f79a8b288a717d1125b19313705690117bd43c926394b105ad6bec429a8dca10: Status 404 returned error can't find the container with id f79a8b288a717d1125b19313705690117bd43c926394b105ad6bec429a8dca10 Apr 23 16:35:17.161308 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.161269 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadecd1bd_5d92_46f9_97d2_d824dfa55524.slice/crio-af1332a00c53298493bacdd044e36105b1d5166bedf95d80e182ad2c73b44cc1 WatchSource:0}: Error finding container af1332a00c53298493bacdd044e36105b1d5166bedf95d80e182ad2c73b44cc1: Status 404 returned error can't find 
the container with id af1332a00c53298493bacdd044e36105b1d5166bedf95d80e182ad2c73b44cc1 Apr 23 16:35:17.162415 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.162364 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243c2f8c_379f_4f30_8621_de9f2d7fc656.slice/crio-41f8065f5f913c8a8e6e310b560b56d9e84ba7e5c40b837e4d7d707923e04fc1 WatchSource:0}: Error finding container 41f8065f5f913c8a8e6e310b560b56d9e84ba7e5c40b837e4d7d707923e04fc1: Status 404 returned error can't find the container with id 41f8065f5f913c8a8e6e310b560b56d9e84ba7e5c40b837e4d7d707923e04fc1 Apr 23 16:35:17.163355 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:17.163295 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af32d6e_ad97_4dd0_a962_8bc05ab87464.slice/crio-35519d7b3f0cb116a882b790e7bd6bd77ab5937792efd3da0474928c64d2e1ae WatchSource:0}: Error finding container 35519d7b3f0cb116a882b790e7bd6bd77ab5937792efd3da0474928c64d2e1ae: Status 404 returned error can't find the container with id 35519d7b3f0cb116a882b790e7bd6bd77ab5937792efd3da0474928c64d2e1ae Apr 23 16:35:17.225348 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.225261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:17.225420 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.225405 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:17.225486 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.225469 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.225447934 +0000 UTC m=+4.103857533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:17.326456 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.326426 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:17.326610 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.326542 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:17.326610 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.326568 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:17.326610 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.326581 2571 projected.go:194] Error preparing data for projected volume kube-api-access-q2bbv for pod openshift-network-diagnostics/network-check-target-nvscm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:17.326761 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.326654 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv podName:1e04a51c-81ee-4ef6-b546-0b1e611642d1 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.326635097 +0000 UTC m=+4.205044681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-q2bbv" (UniqueName: "kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv") pod "network-check-target-nvscm" (UID: "1e04a51c-81ee-4ef6-b546-0b1e611642d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:17.688111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.688071 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:15 +0000 UTC" deadline="2027-10-31 14:43:34.089141662 +0000 UTC"
Apr 23 16:35:17.688111 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.688107 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13342h8m16.401038067s"
Apr 23 16:35:17.739812 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.739783 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:17.740003 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.739921 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:17.759151 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.758386 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8thv9"]
Apr 23 16:35:17.760529 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.760196 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.760529 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.760273 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:17.765001 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.764975 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" event={"ID":"f4da7ff0dd296ec0b2891ebc8475588c","Type":"ContainerStarted","Data":"88425cf877f2f44cbe4a11a08786d29e05dc51032df95bae3751ced0510a1cbc"}
Apr 23 16:35:17.779243 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.779049 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"35519d7b3f0cb116a882b790e7bd6bd77ab5937792efd3da0474928c64d2e1ae"}
Apr 23 16:35:17.786054 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.783649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8dckj" event={"ID":"243c2f8c-379f-4f30-8621-de9f2d7fc656","Type":"ContainerStarted","Data":"41f8065f5f913c8a8e6e310b560b56d9e84ba7e5c40b837e4d7d707923e04fc1"}
Apr 23 16:35:17.795223 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.795186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zrzxd" event={"ID":"cea1706c-bca2-4388-9143-d1aa2fbcdc62","Type":"ContainerStarted","Data":"0b6fc359505d508010a3d595e4fae742263775f8f79cb08ef6d5fa0bf7ecef78"}
Apr 23 16:35:17.797408 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.797341 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rpmks" event={"ID":"ee000696-33d6-4a0e-a2ac-f710ef5e1515","Type":"ContainerStarted","Data":"96d5a9da4c70168e55c511d4e2baedb8a8cdae5eac95d67bf6b0ff2c14284c09"}
Apr 23 16:35:17.807006 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.806954 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p6qdp" event={"ID":"adecd1bd-5d92-46f9-97d2-d824dfa55524","Type":"ContainerStarted","Data":"af1332a00c53298493bacdd044e36105b1d5166bedf95d80e182ad2c73b44cc1"}
Apr 23 16:35:17.808772 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.808712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" event={"ID":"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577","Type":"ContainerStarted","Data":"f79a8b288a717d1125b19313705690117bd43c926394b105ad6bec429a8dca10"}
Apr 23 16:35:17.811271 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.811220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerStarted","Data":"05a3704ba1c163756a45270c6cd12bb1dac35468011896a2b6fa4b48e72c8b0e"}
Apr 23 16:35:17.813863 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.813815 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-97r2m" event={"ID":"d8172c3c-2af9-4787-983c-905f0f36a0b7","Type":"ContainerStarted","Data":"4b4f5405e38fe53a8d3fd717f89175e5f387d7e76e9c982df032880f63b6cafa"}
Apr 23 16:35:17.819412 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.819369 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9bf7w" event={"ID":"e9cf07b9-000a-4b3d-b47a-38c57899b41d","Type":"ContainerStarted","Data":"2d8b67e30e99a21455b3ae8bd5a1764343677d490c0a08ad1905963263ac3c9a"}
Apr 23 16:35:17.830706 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.830532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/810bb628-4255-436d-8a3c-1b1c2d54dc0e-dbus\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.830706 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.830574 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.830706 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.830637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/810bb628-4255-436d-8a3c-1b1c2d54dc0e-kubelet-config\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.931988 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.931944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/810bb628-4255-436d-8a3c-1b1c2d54dc0e-dbus\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.932150 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.932002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.932150 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.932050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/810bb628-4255-436d-8a3c-1b1c2d54dc0e-kubelet-config\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.932274 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.932178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/810bb628-4255-436d-8a3c-1b1c2d54dc0e-kubelet-config\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.932274 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:17.932246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/810bb628-4255-436d-8a3c-1b1c2d54dc0e-dbus\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:17.932450 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.932424 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:17.932515 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:17.932488 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret podName:810bb628-4255-436d-8a3c-1b1c2d54dc0e nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.432469227 +0000 UTC m=+4.310878813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret") pod "global-pull-secret-syncer-8thv9" (UID: "810bb628-4255-436d-8a3c-1b1c2d54dc0e") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:18.235466 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:18.235425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:18.235815 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.235578 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:18.235815 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.235657 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:20.235638436 +0000 UTC m=+6.114048035 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:18.336321 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:18.336280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:18.336499 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.336458 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:18.336499 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.336477 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:18.336499 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.336490 2571 projected.go:194] Error preparing data for projected volume kube-api-access-q2bbv for pod openshift-network-diagnostics/network-check-target-nvscm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:18.336688 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.336546 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv podName:1e04a51c-81ee-4ef6-b546-0b1e611642d1 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:20.336527788 +0000 UTC m=+6.214937386 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-q2bbv" (UniqueName: "kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv") pod "network-check-target-nvscm" (UID: "1e04a51c-81ee-4ef6-b546-0b1e611642d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:18.437204 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:18.437167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:18.437367 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.437315 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:18.437419 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.437374 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret podName:810bb628-4255-436d-8a3c-1b1c2d54dc0e nodeName:}" failed. No retries permitted until 2026-04-23 16:35:19.437357286 +0000 UTC m=+5.315766874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret") pod "global-pull-secret-syncer-8thv9" (UID: "810bb628-4255-436d-8a3c-1b1c2d54dc0e") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:18.739723 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:18.739692 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:18.740179 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:18.739833 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:18.848394 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:18.848349 2571 generic.go:358] "Generic (PLEG): container finished" podID="2031b0c27b4132e2a5b8904a3843f99d" containerID="13ffe59d0d5d487b96814937071419edd32db2b933cf854d636e97632b33c419" exitCode=0
Apr 23 16:35:18.848570 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:18.848459 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" event={"ID":"2031b0c27b4132e2a5b8904a3843f99d","Type":"ContainerDied","Data":"13ffe59d0d5d487b96814937071419edd32db2b933cf854d636e97632b33c419"}
Apr 23 16:35:18.865379 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:18.864652 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-190.ec2.internal" podStartSLOduration=2.86463408 podStartE2EDuration="2.86463408s" podCreationTimestamp="2026-04-23 16:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:17.801255981 +0000 UTC m=+3.679665589" watchObservedRunningTime="2026-04-23 16:35:18.86463408 +0000 UTC m=+4.743043689"
Apr 23 16:35:19.448634 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:19.448071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:19.448634 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:19.448248 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:19.448634 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:19.448313 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret podName:810bb628-4255-436d-8a3c-1b1c2d54dc0e nodeName:}" failed. No retries permitted until 2026-04-23 16:35:21.448294076 +0000 UTC m=+7.326703683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret") pod "global-pull-secret-syncer-8thv9" (UID: "810bb628-4255-436d-8a3c-1b1c2d54dc0e") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:19.739593 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:19.739520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:19.739771 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:19.739633 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:19.740138 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:19.739903 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:19.740138 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:19.739965 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:19.858446 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:19.857644 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" event={"ID":"2031b0c27b4132e2a5b8904a3843f99d","Type":"ContainerStarted","Data":"d8fd8257b35f0f53dceefe7f35bb2b3aadc0154426f9f920a70e2189421b8a70"}
Apr 23 16:35:20.255124 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:20.255087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:20.255303 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:20.255278 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:20.255378 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:20.255334 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:24.255315699 +0000 UTC m=+10.133725283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:20.355870 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:20.355717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:20.356043 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:20.355899 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:20.356043 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:20.355925 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:20.356043 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:20.355940 2571 projected.go:194] Error preparing data for projected volume kube-api-access-q2bbv for pod openshift-network-diagnostics/network-check-target-nvscm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:20.356043 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:20.356018 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv podName:1e04a51c-81ee-4ef6-b546-0b1e611642d1 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:24.355984932 +0000 UTC m=+10.234394517 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-q2bbv" (UniqueName: "kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv") pod "network-check-target-nvscm" (UID: "1e04a51c-81ee-4ef6-b546-0b1e611642d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:20.743821 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:20.743747 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:20.744261 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:20.743882 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:21.464696 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:21.464092 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:21.464696 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:21.464224 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:21.464696 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:21.464286 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret podName:810bb628-4255-436d-8a3c-1b1c2d54dc0e nodeName:}" failed. No retries permitted until 2026-04-23 16:35:25.464268267 +0000 UTC m=+11.342677853 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret") pod "global-pull-secret-syncer-8thv9" (UID: "810bb628-4255-436d-8a3c-1b1c2d54dc0e") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:21.740205 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:21.740100 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:21.740372 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:21.740240 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:21.740372 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:21.740332 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:21.740489 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:21.740437 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:22.740077 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:22.740042 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:22.740518 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:22.740188 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:23.739670 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:23.739636 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:23.739670 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:23.739684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:23.739982 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:23.739780 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:23.739982 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:23.739844 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:24.290047 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:24.289980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:24.290535 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:24.290125 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:24.290535 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:24.290197 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:32.290177753 +0000 UTC m=+18.168587342 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:24.391874 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:24.391297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:24.391874 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:24.391447 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:24.391874 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:24.391467 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:24.391874 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:24.391479 2571 projected.go:194] Error preparing data for projected volume kube-api-access-q2bbv for pod openshift-network-diagnostics/network-check-target-nvscm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:24.391874 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:24.391535 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv podName:1e04a51c-81ee-4ef6-b546-0b1e611642d1 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:32.391515051 +0000 UTC m=+18.269924641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-q2bbv" (UniqueName: "kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv") pod "network-check-target-nvscm" (UID: "1e04a51c-81ee-4ef6-b546-0b1e611642d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:24.742169 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:24.740577 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:24.742169 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:24.740713 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:25.501018 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:25.500965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:25.501449 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:25.501087 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:25.501449 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:25.501156 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret podName:810bb628-4255-436d-8a3c-1b1c2d54dc0e nodeName:}" failed. No retries permitted until 2026-04-23 16:35:33.501136532 +0000 UTC m=+19.379546133 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret") pod "global-pull-secret-syncer-8thv9" (UID: "810bb628-4255-436d-8a3c-1b1c2d54dc0e") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:25.740265 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:25.740227 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:25.740443 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:25.740229 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:25.740443 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:25.740355 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:25.740443 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:25.740434 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:26.740297 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:26.739995 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:26.740761 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:26.740366 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:27.739448 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:27.739419 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:27.739650 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:27.739492 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:27.739650 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:27.739612 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:27.739780 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:27.739752 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:28.739645 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:28.739610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:28.740095 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:28.739725 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99" Apr 23 16:35:29.739756 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:29.739720 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9" Apr 23 16:35:29.740299 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:29.739726 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:29.740299 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:29.739835 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e" Apr 23 16:35:29.740299 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:29.739934 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1" Apr 23 16:35:30.739793 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:30.739759 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:30.740263 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:30.739900 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99" Apr 23 16:35:31.739490 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:31.739452 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9" Apr 23 16:35:31.739685 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:31.739497 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:31.739685 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:31.739568 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e" Apr 23 16:35:31.739775 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:31.739689 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1" Apr 23 16:35:32.354098 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:32.354068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:32.354474 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:32.354207 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:32.354474 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:32.354269 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:48.35425121 +0000 UTC m=+34.232660808 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:32.454823 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:32.454787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:32.454975 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:32.454916 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:32.454975 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:32.454932 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:32.454975 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:32.454940 2571 projected.go:194] Error preparing data for projected volume kube-api-access-q2bbv for pod openshift-network-diagnostics/network-check-target-nvscm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:32.455074 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:32.454991 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv podName:1e04a51c-81ee-4ef6-b546-0b1e611642d1 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:48.454977002 +0000 UTC m=+34.333386608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-q2bbv" (UniqueName: "kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv") pod "network-check-target-nvscm" (UID: "1e04a51c-81ee-4ef6-b546-0b1e611642d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:32.739557 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:32.739490 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:32.739712 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:32.739615 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99" Apr 23 16:35:33.562451 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:33.562417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9" Apr 23 16:35:33.562889 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:33.562567 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:33.562889 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:33.562643 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret podName:810bb628-4255-436d-8a3c-1b1c2d54dc0e nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.562625164 +0000 UTC m=+35.441034760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret") pod "global-pull-secret-syncer-8thv9" (UID: "810bb628-4255-436d-8a3c-1b1c2d54dc0e") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:33.740208 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:33.740171 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:33.740366 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:33.740286 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1" Apr 23 16:35:33.740366 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:33.740354 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9" Apr 23 16:35:33.740517 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:33.740491 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e" Apr 23 16:35:34.740735 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.740533 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:34.741359 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:34.740838 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99" Apr 23 16:35:34.887365 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.887333 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9bf7w" event={"ID":"e9cf07b9-000a-4b3d-b47a-38c57899b41d","Type":"ContainerStarted","Data":"50429174da9d360affba0d53d9ed7273546b0f9b766811bd64506a287c6f13c0"} Apr 23 16:35:34.888735 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.888714 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:35:34.889011 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.888992 2571 generic.go:358] "Generic (PLEG): container finished" podID="9af32d6e-ad97-4dd0-a962-8bc05ab87464" containerID="94ccfd5b1f1bd2a7155221abaa8f4a2facb7401a6499eca1bc70a605da1c46e7" exitCode=1 Apr 23 16:35:34.889064 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.889046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerDied","Data":"94ccfd5b1f1bd2a7155221abaa8f4a2facb7401a6499eca1bc70a605da1c46e7"} Apr 23 16:35:34.889107 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.889070 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"6b7cce6c98958d69aa32373b88d02eefeda589807d0b559d235f9efc85b2b2eb"} Apr 23 16:35:34.890182 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.890158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8dckj" event={"ID":"243c2f8c-379f-4f30-8621-de9f2d7fc656","Type":"ContainerStarted","Data":"f3ff68bc5b61ddefd3bf85b9216f5617b93182a5fee979cf85a71193aad9436d"} Apr 23 16:35:34.891260 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.891242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zrzxd" event={"ID":"cea1706c-bca2-4388-9143-d1aa2fbcdc62","Type":"ContainerStarted","Data":"e4f736d3b1ca79d95f268aa13954b61ec62b5145b644d3605b7e989b3f8e8a65"} Apr 23 16:35:34.892288 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.892267 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rpmks" event={"ID":"ee000696-33d6-4a0e-a2ac-f710ef5e1515","Type":"ContainerStarted","Data":"41f838601893b8d17e45fb0fe757522b659dc77079082e27af7f3645c3e52134"} Apr 23 16:35:34.893444 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.893423 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" event={"ID":"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577","Type":"ContainerStarted","Data":"a5d042a3e036621b638dc7be805b67a2364eb1018eabecc44786478438cb0c82"} Apr 23 16:35:34.894609 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.894575 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerStarted","Data":"1ca8a105d27ae1147c72102c75e523dcc809d3967267520885aad31109367978"} Apr 23 16:35:34.895664 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.895644 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-97r2m" event={"ID":"d8172c3c-2af9-4787-983c-905f0f36a0b7","Type":"ContainerStarted","Data":"f1ca5481a88ebed008964e3b01ef87bda8e2f7a83cb4c52ba0bfc015ba17a3a3"} Apr 23 16:35:34.909006 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.908971 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-190.ec2.internal" podStartSLOduration=18.90895991 podStartE2EDuration="18.90895991s" 
podCreationTimestamp="2026-04-23 16:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:19.881246262 +0000 UTC m=+5.759655868" watchObservedRunningTime="2026-04-23 16:35:34.90895991 +0000 UTC m=+20.787369514" Apr 23 16:35:34.909110 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.909047 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9bf7w" podStartSLOduration=3.60576173 podStartE2EDuration="20.909041254s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.153826404 +0000 UTC m=+3.032236002" lastFinishedPulling="2026-04-23 16:35:34.45710594 +0000 UTC m=+20.335515526" observedRunningTime="2026-04-23 16:35:34.908679117 +0000 UTC m=+20.787088738" watchObservedRunningTime="2026-04-23 16:35:34.909041254 +0000 UTC m=+20.787450860" Apr 23 16:35:34.990280 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.990217 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rpmks" podStartSLOduration=3.7054969829999997 podStartE2EDuration="20.990197808s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.156824973 +0000 UTC m=+3.035234556" lastFinishedPulling="2026-04-23 16:35:34.441525784 +0000 UTC m=+20.319935381" observedRunningTime="2026-04-23 16:35:34.989882237 +0000 UTC m=+20.868291852" watchObservedRunningTime="2026-04-23 16:35:34.990197808 +0000 UTC m=+20.868607414" Apr 23 16:35:34.990851 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:34.990812 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-97r2m" podStartSLOduration=3.750897905 podStartE2EDuration="20.990801729s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.16062339 +0000 UTC m=+3.039032986" 
lastFinishedPulling="2026-04-23 16:35:34.400527214 +0000 UTC m=+20.278936810" observedRunningTime="2026-04-23 16:35:34.966730092 +0000 UTC m=+20.845139698" watchObservedRunningTime="2026-04-23 16:35:34.990801729 +0000 UTC m=+20.869211386" Apr 23 16:35:35.027014 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.026959 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8dckj" podStartSLOduration=2.7478611710000003 podStartE2EDuration="20.026939348s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.164554982 +0000 UTC m=+3.042964568" lastFinishedPulling="2026-04-23 16:35:34.44363315 +0000 UTC m=+20.322042745" observedRunningTime="2026-04-23 16:35:35.009172909 +0000 UTC m=+20.887582514" watchObservedRunningTime="2026-04-23 16:35:35.026939348 +0000 UTC m=+20.905348952" Apr 23 16:35:35.740254 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.740029 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9" Apr 23 16:35:35.740254 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.740029 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:35.740444 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:35.740304 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e" Apr 23 16:35:35.740444 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:35.740367 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1" Apr 23 16:35:35.899315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.899290 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:35:35.899714 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.899612 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"484fcc5376fb13c8378f041ba28a873636db4543f0e6d3492f8e60b6fa6863a0"} Apr 23 16:35:35.899714 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.899643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"20fdf782e64b3981665a1de44f5b89dbcbe7e19f51412fe059609470c18a09ad"} Apr 23 16:35:35.899714 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.899653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"36ea48dd54927f01fec0d7e4d4c77c485d5cff4b4bb8d1100fa5e33fe6bfe8ef"} Apr 23 16:35:35.899714 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.899666 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"f2a9604ee384444ae901d3358c275fbb1d8c3da411c85130bbe1764610700a41"} Apr 23 16:35:35.900741 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.900713 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p6qdp" event={"ID":"adecd1bd-5d92-46f9-97d2-d824dfa55524","Type":"ContainerStarted","Data":"c9661a744b6321333fa263891870a6820408ea7f0dfdc234e05fc5a06addf79f"} Apr 23 16:35:35.902065 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.902043 2571 generic.go:358] "Generic (PLEG): container finished" podID="204aff10-0787-493a-aee6-18d5bf3d74be" containerID="1ca8a105d27ae1147c72102c75e523dcc809d3967267520885aad31109367978" exitCode=0 Apr 23 16:35:35.902192 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.902152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerDied","Data":"1ca8a105d27ae1147c72102c75e523dcc809d3967267520885aad31109367978"} Apr 23 16:35:35.921671 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.921630 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zrzxd" podStartSLOduration=11.788593996 podStartE2EDuration="20.921618668s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.157985177 +0000 UTC m=+3.036394760" lastFinishedPulling="2026-04-23 16:35:26.29100983 +0000 UTC m=+12.169419432" observedRunningTime="2026-04-23 16:35:35.029430812 +0000 UTC m=+20.907840417" watchObservedRunningTime="2026-04-23 16:35:35.921618668 +0000 UTC m=+21.800028264" Apr 23 16:35:35.957726 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:35.957686 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-p6qdp" podStartSLOduration=4.680833326 podStartE2EDuration="21.957674055s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.16380723 +0000 UTC m=+3.042216830" lastFinishedPulling="2026-04-23 16:35:34.440647973 +0000 UTC m=+20.319057559" observedRunningTime="2026-04-23 16:35:35.922057823 +0000 UTC m=+21.800467439" watchObservedRunningTime="2026-04-23 16:35:35.957674055 +0000 UTC m=+21.836083659" Apr 23 16:35:36.342877 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:36.342855 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:36.670753 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:36.670556 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:36.342871748Z","UUID":"d1094ec6-508a-4f95-bb8a-8dd913aead57","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:36.672197 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:36.672173 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:36.672307 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:36.672204 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:36.739458 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:36.739422 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:36.739692 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:36.739565 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99" Apr 23 16:35:36.905900 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:36.905866 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" event={"ID":"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577","Type":"ContainerStarted","Data":"491bbb46eb3294dd2dd07ed24aad3fe4c8434b56cf605e924853e07269e7ae2b"} Apr 23 16:35:37.740274 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:37.740207 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9" Apr 23 16:35:37.740408 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:37.740207 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:37.740408 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:37.740318 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:37.740408 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:37.740388 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:37.910728 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:37.910696 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log"
Apr 23 16:35:37.911271 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:37.911168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"e283a7417c8bbd4eb7502f34365e1583726dcaa5f02936af427bcfc240d0f9b6"}
Apr 23 16:35:37.913108 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:37.913077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" event={"ID":"ffff79cd-77b6-4fa7-a7f2-a92f6efbb577","Type":"ContainerStarted","Data":"176d324c512a3ecf39acd686c3f3b5c514cd090fb144b90aa1fcc9922cfb3001"}
Apr 23 16:35:38.739830 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:38.739795 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:38.740004 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:38.739939 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:39.413491 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:39.413460 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-97r2m"
Apr 23 16:35:39.414487 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:39.414461 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-97r2m"
Apr 23 16:35:39.433080 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:39.433019 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-md68h" podStartSLOduration=4.044157735 podStartE2EDuration="24.433002894s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.162580796 +0000 UTC m=+3.040990392" lastFinishedPulling="2026-04-23 16:35:37.551425955 +0000 UTC m=+23.429835551" observedRunningTime="2026-04-23 16:35:37.935320817 +0000 UTC m=+23.813730421" watchObservedRunningTime="2026-04-23 16:35:39.433002894 +0000 UTC m=+25.311412498"
Apr 23 16:35:39.739981 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:39.739958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:39.740127 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:39.739958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:39.740127 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:39.740081 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:39.740236 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:39.740157 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:39.916628 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:39.916436 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-97r2m"
Apr 23 16:35:39.916921 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:39.916904 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-97r2m"
Apr 23 16:35:40.739863 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.739827 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:40.740251 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:40.739943 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:40.919272 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.919243 2571 generic.go:358] "Generic (PLEG): container finished" podID="204aff10-0787-493a-aee6-18d5bf3d74be" containerID="5aacd9b47bf9df06607c6104147a6f50d54415277bd8fd23d7c7b1bdbd670d1d" exitCode=0
Apr 23 16:35:40.919416 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.919318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerDied","Data":"5aacd9b47bf9df06607c6104147a6f50d54415277bd8fd23d7c7b1bdbd670d1d"}
Apr 23 16:35:40.922378 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.922341 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log"
Apr 23 16:35:40.922718 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.922694 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"ed3324468b90a9a62f8de6d94b5312a5df182b5a2958e1b3abc7df077487b8ee"}
Apr 23 16:35:40.923059 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.923031 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:40.923059 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.923058 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:40.923181 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.923162 2571 scope.go:117] "RemoveContainer" containerID="94ccfd5b1f1bd2a7155221abaa8f4a2facb7401a6499eca1bc70a605da1c46e7"
Apr 23 16:35:40.938414 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.938396 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:40.938493 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:40.938458 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:41.041074 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.041009 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:35:41.739843 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.739671 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:41.739984 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.739722 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:41.739984 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:41.739930 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:41.739984 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:41.739974 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:41.871446 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.871360 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7kt9x"]
Apr 23 16:35:41.871590 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.871469 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:41.871680 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:41.871590 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:41.872124 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.872105 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nvscm"]
Apr 23 16:35:41.872862 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.872840 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8thv9"]
Apr 23 16:35:41.925804 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.925778 2571 generic.go:358] "Generic (PLEG): container finished" podID="204aff10-0787-493a-aee6-18d5bf3d74be" containerID="04231acccc8d8657d8571dcba1bed7ecb49482a886ad07ea0184539af3c58fff" exitCode=0
Apr 23 16:35:41.925925 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.925858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerDied","Data":"04231acccc8d8657d8571dcba1bed7ecb49482a886ad07ea0184539af3c58fff"}
Apr 23 16:35:41.929009 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.928984 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log"
Apr 23 16:35:41.929301 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.929274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" event={"ID":"9af32d6e-ad97-4dd0-a962-8bc05ab87464","Type":"ContainerStarted","Data":"ec229134fe3444548873be93d9d4913345ce1083bb0f756a943b860cfb0c8eb1"}
Apr 23 16:35:41.929384 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.929319 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:41.929384 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.929322 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:41.929517 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:41.929476 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:41.929691 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:41.929668 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:41.993931 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:41.993884 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml" podStartSLOduration=10.670919208 podStartE2EDuration="27.993871949s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.165622715 +0000 UTC m=+3.044032298" lastFinishedPulling="2026-04-23 16:35:34.488575439 +0000 UTC m=+20.366985039" observedRunningTime="2026-04-23 16:35:41.992335027 +0000 UTC m=+27.870744648" watchObservedRunningTime="2026-04-23 16:35:41.993871949 +0000 UTC m=+27.872281548"
Apr 23 16:35:42.933278 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:42.933190 2571 generic.go:358] "Generic (PLEG): container finished" podID="204aff10-0787-493a-aee6-18d5bf3d74be" containerID="4fff05505101869c80f071532ab492c77b25f268791b65737010c2205dd00145" exitCode=0
Apr 23 16:35:42.933278 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:42.933269 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerDied","Data":"4fff05505101869c80f071532ab492c77b25f268791b65737010c2205dd00145"}
Apr 23 16:35:43.739651 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:43.739617 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:43.739813 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:43.739622 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:43.739813 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:43.739741 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:43.739813 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:43.739617 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:43.739968 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:43.739810 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:43.739968 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:43.739876 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:45.739513 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:45.739470 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:45.740051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:45.739470 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:45.740051 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:45.739488 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:45.740051 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:45.739614 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:45.740051 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:45.739684 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:45.740051 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:45.739786 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:47.740135 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:47.739948 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:47.740686 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:47.740005 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:47.740686 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:47.740218 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvscm" podUID="1e04a51c-81ee-4ef6-b546-0b1e611642d1"
Apr 23 16:35:47.740686 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:47.740033 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:47.740686 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:47.740387 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8thv9" podUID="810bb628-4255-436d-8a3c-1b1c2d54dc0e"
Apr 23 16:35:47.740686 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:47.740418 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:35:48.386257 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.386225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:35:48.386473 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.386383 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:48.386473 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.386448 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.386428308 +0000 UTC m=+66.264837891 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:48.427577 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.427554 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-190.ec2.internal" event="NodeReady"
Apr 23 16:35:48.427723 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.427683 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 16:35:48.480183 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.480158 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-79c4v"]
Apr 23 16:35:48.487081 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.486990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:35:48.487190 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.487099 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:48.487190 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.487117 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:48.487190 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.487155 2571 projected.go:194] Error preparing data for projected volume kube-api-access-q2bbv for pod openshift-network-diagnostics/network-check-target-nvscm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:48.487344 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.487206 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv podName:1e04a51c-81ee-4ef6-b546-0b1e611642d1 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.487192387 +0000 UTC m=+66.365601982 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-q2bbv" (UniqueName: "kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv") pod "network-check-target-nvscm" (UID: "1e04a51c-81ee-4ef6-b546-0b1e611642d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:48.502254 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.502231 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bcpj7"]
Apr 23 16:35:48.502411 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.502391 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.505230 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.505182 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 16:35:48.505368 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.505266 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 16:35:48.505368 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.505360 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vvvrx\""
Apr 23 16:35:48.528985 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.528963 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-79c4v"]
Apr 23 16:35:48.528985 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.528990 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bcpj7"]
Apr 23 16:35:48.529124 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.529090 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:48.531982 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.531964 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 16:35:48.532188 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.532174 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9v47\""
Apr 23 16:35:48.532249 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.532231 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 16:35:48.532433 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.532419 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 16:35:48.687832 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.687801 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90a007dd-2ad4-4fff-96a4-2c15593ab169-config-volume\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.688001 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.687844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdfcl\" (UniqueName: \"kubernetes.io/projected/e8ef03b2-bf64-4538-9ba2-cd786001399c-kube-api-access-wdfcl\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:48.688001 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.687937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:48.688115 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.688014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhsnp\" (UniqueName: \"kubernetes.io/projected/90a007dd-2ad4-4fff-96a4-2c15593ab169-kube-api-access-vhsnp\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.688115 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.688091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/90a007dd-2ad4-4fff-96a4-2c15593ab169-tmp-dir\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.688207 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.688131 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.789086 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhsnp\" (UniqueName: \"kubernetes.io/projected/90a007dd-2ad4-4fff-96a4-2c15593ab169-kube-api-access-vhsnp\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/90a007dd-2ad4-4fff-96a4-2c15593ab169-tmp-dir\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.789206 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90a007dd-2ad4-4fff-96a4-2c15593ab169-config-volume\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.789251 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.28923563 +0000 UTC m=+35.167645213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdfcl\" (UniqueName: \"kubernetes.io/projected/e8ef03b2-bf64-4538-9ba2-cd786001399c-kube-api-access-wdfcl\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789333 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:48.789434 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/90a007dd-2ad4-4fff-96a4-2c15593ab169-tmp-dir\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.789735 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.789451 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:48.789735 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:48.789497 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.28948303 +0000 UTC m=+35.167892612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found
Apr 23 16:35:48.789802 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.789749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90a007dd-2ad4-4fff-96a4-2c15593ab169-config-volume\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.802401 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.802374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdfcl\" (UniqueName: \"kubernetes.io/projected/e8ef03b2-bf64-4538-9ba2-cd786001399c-kube-api-access-wdfcl\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:48.802531 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.802417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhsnp\" (UniqueName: \"kubernetes.io/projected/90a007dd-2ad4-4fff-96a4-2c15593ab169-kube-api-access-vhsnp\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:48.947442 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.947412 2571 generic.go:358] "Generic (PLEG): container finished" podID="204aff10-0787-493a-aee6-18d5bf3d74be" containerID="6816162b36bbd3d6ef43e1acb2c50e4d348f61199af671e1d2f9e634ed233b54" exitCode=0
Apr 23 16:35:48.947542 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:48.947474 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerDied","Data":"6816162b36bbd3d6ef43e1acb2c50e4d348f61199af671e1d2f9e634ed233b54"}
Apr 23 16:35:49.292706 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.292634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:49.292706 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.292694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:49.292888 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:49.292781 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:49.292888 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:49.292792 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:49.292888 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:49.292843 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:50.292824679 +0000 UTC m=+36.171234271 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:49.292888 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:49.292858 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:50.292852388 +0000 UTC m=+36.171261970 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found
Apr 23 16:35:49.594879 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.594791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:35:49.595026 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:49.594939 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:49.595026 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:49.595001 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret podName:810bb628-4255-436d-8a3c-1b1c2d54dc0e nodeName:}" failed. No retries permitted until 2026-04-23 16:36:21.594986579 +0000 UTC m=+67.473396168 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret") pod "global-pull-secret-syncer-8thv9" (UID: "810bb628-4255-436d-8a3c-1b1c2d54dc0e") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:49.739951 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.739924 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm" Apr 23 16:35:49.740079 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.739976 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:35:49.740079 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.739988 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9" Apr 23 16:35:49.742983 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.742957 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:35:49.743119 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.742957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nldpm\"" Apr 23 16:35:49.743119 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.743024 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:35:49.743244 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.743229 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:35:49.743287 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.743252 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:35:49.744215 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.744196 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tpj7t\"" Apr 23 16:35:49.951697 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.951666 2571 generic.go:358] "Generic (PLEG): container finished" podID="204aff10-0787-493a-aee6-18d5bf3d74be" containerID="752102ced6ec3b51df4f07c633babf02505baf1318306b13222ee05074936400" exitCode=0 Apr 23 16:35:49.952024 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:49.951711 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerDied","Data":"752102ced6ec3b51df4f07c633babf02505baf1318306b13222ee05074936400"} Apr 23 16:35:50.300484 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.300402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v" Apr 23 16:35:50.300484 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.300453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7" Apr 23 16:35:50.300684 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:50.300548 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:50.300684 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:50.300568 2571 secret.go:189] 
Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:50.300684 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:50.300620 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:52.300591125 +0000 UTC m=+38.179000708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found Apr 23 16:35:50.300684 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:50.300633 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:52.300627783 +0000 UTC m=+38.179037365 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found Apr 23 16:35:50.565496 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.565409 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc"] Apr 23 16:35:50.568546 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.568520 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn"] Apr 23 16:35:50.568703 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.568672 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.571270 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.571243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 16:35:50.571448 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.571426 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-p2jnj\"" Apr 23 16:35:50.571755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.571738 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.572436 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.572417 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 16:35:50.572436 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.572425 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 16:35:50.572578 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.572465 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 16:35:50.576025 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.575841 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 16:35:50.576025 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.575954 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 16:35:50.576629 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.576342 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 16:35:50.576629 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.576349 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 16:35:50.592145 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.592091 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc"] Apr 23 16:35:50.593011 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.592989 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn"] Apr 23 16:35:50.602726 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.602706 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmqcg\" (UniqueName: \"kubernetes.io/projected/bef6393a-edcc-44f9-ade1-a5a09b18288b-kube-api-access-xmqcg\") pod \"managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc\" (UID: \"bef6393a-edcc-44f9-ade1-a5a09b18288b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.602823 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.602772 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bef6393a-edcc-44f9-ade1-a5a09b18288b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc\" (UID: \"bef6393a-edcc-44f9-ade1-a5a09b18288b\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.703377 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a7d872aa-f6d8-4da7-929f-adb4601407db-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.703377 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.703624 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703404 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmqcg\" (UniqueName: \"kubernetes.io/projected/bef6393a-edcc-44f9-ade1-a5a09b18288b-kube-api-access-xmqcg\") pod \"managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc\" (UID: \"bef6393a-edcc-44f9-ade1-a5a09b18288b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.703624 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703423 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-hub\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 
23 16:35:50.703624 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703441 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvkx\" (UniqueName: \"kubernetes.io/projected/a7d872aa-f6d8-4da7-929f-adb4601407db-kube-api-access-rxvkx\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.703624 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bef6393a-edcc-44f9-ade1-a5a09b18288b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc\" (UID: \"bef6393a-edcc-44f9-ade1-a5a09b18288b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.703624 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-ca\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.703846 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.703631 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.707249 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.707222 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bef6393a-edcc-44f9-ade1-a5a09b18288b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc\" (UID: \"bef6393a-edcc-44f9-ade1-a5a09b18288b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.713147 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.713125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmqcg\" (UniqueName: \"kubernetes.io/projected/bef6393a-edcc-44f9-ade1-a5a09b18288b-kube-api-access-xmqcg\") pod \"managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc\" (UID: \"bef6393a-edcc-44f9-ade1-a5a09b18288b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.804233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.804188 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a7d872aa-f6d8-4da7-929f-adb4601407db-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.804233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.804233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.804469 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.804262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-hub\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.804469 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.804431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvkx\" (UniqueName: \"kubernetes.io/projected/a7d872aa-f6d8-4da7-929f-adb4601407db-kube-api-access-rxvkx\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.804588 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.804509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-ca\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.804668 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.804614 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.805069 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.805043 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a7d872aa-f6d8-4da7-929f-adb4601407db-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: 
\"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.806758 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.806735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-ca\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.807027 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.807005 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-hub\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.807091 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.807059 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.807147 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.807065 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7d872aa-f6d8-4da7-929f-adb4601407db-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.840474 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.840412 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvkx\" (UniqueName: \"kubernetes.io/projected/a7d872aa-f6d8-4da7-929f-adb4601407db-kube-api-access-rxvkx\") pod \"cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn\" (UID: \"a7d872aa-f6d8-4da7-929f-adb4601407db\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.888507 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.888333 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" Apr 23 16:35:50.893159 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.893137 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:35:50.959348 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:50.958989 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" event={"ID":"204aff10-0787-493a-aee6-18d5bf3d74be","Type":"ContainerStarted","Data":"3e91b3282549bb090c06c30d2a9216ef36cb1b032e7b2d0cb7d6427d6e3d5b47"} Apr 23 16:35:51.053054 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:51.053000 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6z9f6" podStartSLOduration=4.609493373 podStartE2EDuration="36.052979414s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:17.161804596 +0000 UTC m=+3.040214194" lastFinishedPulling="2026-04-23 16:35:48.60529065 +0000 UTC m=+34.483700235" observedRunningTime="2026-04-23 16:35:51.003467526 +0000 UTC m=+36.881877134" watchObservedRunningTime="2026-04-23 16:35:51.052979414 +0000 UTC m=+36.931389023" Apr 23 16:35:51.053398 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:51.053381 2571 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc"] Apr 23 16:35:51.056931 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:51.056901 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbef6393a_edcc_44f9_ade1_a5a09b18288b.slice/crio-9cbcc8ab76ef0759f16af1ecc5cc5b8aed62480f91253768dd93390ed23097ef WatchSource:0}: Error finding container 9cbcc8ab76ef0759f16af1ecc5cc5b8aed62480f91253768dd93390ed23097ef: Status 404 returned error can't find the container with id 9cbcc8ab76ef0759f16af1ecc5cc5b8aed62480f91253768dd93390ed23097ef Apr 23 16:35:51.060213 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:51.060192 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn"] Apr 23 16:35:51.063388 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:35:51.063368 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d872aa_f6d8_4da7_929f_adb4601407db.slice/crio-fdc38a680185a51916caf67912c81325933d633fb11db1869066167aa9c224bc WatchSource:0}: Error finding container fdc38a680185a51916caf67912c81325933d633fb11db1869066167aa9c224bc: Status 404 returned error can't find the container with id fdc38a680185a51916caf67912c81325933d633fb11db1869066167aa9c224bc Apr 23 16:35:51.963121 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:51.963082 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" event={"ID":"bef6393a-edcc-44f9-ade1-a5a09b18288b","Type":"ContainerStarted","Data":"9cbcc8ab76ef0759f16af1ecc5cc5b8aed62480f91253768dd93390ed23097ef"} Apr 23 16:35:51.965317 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:51.965288 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" event={"ID":"a7d872aa-f6d8-4da7-929f-adb4601407db","Type":"ContainerStarted","Data":"fdc38a680185a51916caf67912c81325933d633fb11db1869066167aa9c224bc"} Apr 23 16:35:52.317747 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:52.317654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v" Apr 23 16:35:52.317747 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:52.317733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7" Apr 23 16:35:52.317968 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:52.317873 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:52.317968 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:52.317937 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:35:56.317918769 +0000 UTC m=+42.196328367 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found
Apr 23 16:35:52.318370 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:52.318351 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:52.318448 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:52.318403 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:56.318388781 +0000 UTC m=+42.196798364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:54.971763 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:54.971699 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" event={"ID":"bef6393a-edcc-44f9-ade1-a5a09b18288b","Type":"ContainerStarted","Data":"8f3c0e8264f72dec4c7302e4ef2dd5489850ce4f7dec9dea12a35696f9f4d641"}
Apr 23 16:35:54.973279 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:54.973251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" event={"ID":"a7d872aa-f6d8-4da7-929f-adb4601407db","Type":"ContainerStarted","Data":"713a2b568c96137c6ae9adcd625124253bd9697cc4aefabe4d7beef8e8998abd"}
Apr 23 16:35:54.989976 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:54.989925 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" podStartSLOduration=1.679406437 podStartE2EDuration="4.989913146s" podCreationTimestamp="2026-04-23 16:35:50 +0000 UTC" firstStartedPulling="2026-04-23 16:35:51.058731526 +0000 UTC m=+36.937141110" lastFinishedPulling="2026-04-23 16:35:54.369238232 +0000 UTC m=+40.247647819" observedRunningTime="2026-04-23 16:35:54.989037478 +0000 UTC m=+40.867447085" watchObservedRunningTime="2026-04-23 16:35:54.989913146 +0000 UTC m=+40.868322800"
Apr 23 16:35:56.362535 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:56.362500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:35:56.362903 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:56.362579 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:35:56.362903 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:56.362659 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:56.362903 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:56.362686 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:56.362903 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:56.362727 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:04.362711801 +0000 UTC m=+50.241121384 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found
Apr 23 16:35:56.362903 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:35:56.362741 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:04.362735279 +0000 UTC m=+50.241144861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:56.978747 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:56.978706 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" event={"ID":"a7d872aa-f6d8-4da7-929f-adb4601407db","Type":"ContainerStarted","Data":"10bd77ac328bcf3dddb0386ed33b4b49b88b2a7d11af38444704c4c247e1e76e"}
Apr 23 16:35:56.979013 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:56.978751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" event={"ID":"a7d872aa-f6d8-4da7-929f-adb4601407db","Type":"ContainerStarted","Data":"1aa242a813d1a66806a9334aef1ec8d787faa9f1502ad9887936ae78f0cc18cb"}
Apr 23 16:35:56.999545 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:35:56.999502 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" podStartSLOduration=1.529906636 podStartE2EDuration="6.999488167s" podCreationTimestamp="2026-04-23 16:35:50 +0000 UTC" firstStartedPulling="2026-04-23 16:35:51.064975285 +0000 UTC m=+36.943384869" lastFinishedPulling="2026-04-23 16:35:56.534556813 +0000 UTC m=+42.412966400" observedRunningTime="2026-04-23 16:35:56.998158637 +0000 UTC m=+42.876568242" watchObservedRunningTime="2026-04-23 16:35:56.999488167 +0000 UTC m=+42.877897772"
Apr 23 16:36:04.420920 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:04.420881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:36:04.421384 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:04.420934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:36:04.421384 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:04.421014 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:04.421384 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:04.421018 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:04.421384 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:04.421070 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.421053221 +0000 UTC m=+66.299462803 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found
Apr 23 16:36:04.421384 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:04.421158 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:20.421142948 +0000 UTC m=+66.299552531 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:12.943880 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:12.943848 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q6qml"
Apr 23 16:36:20.432306 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.432265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:36:20.432306 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.432307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:36:20.432898 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:20.432416 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:20.432898 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:20.432472 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:52.43245678 +0000 UTC m=+98.310866362 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:20.432898 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.432468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:36:20.432898 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:20.432640 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:20.432898 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:20.432719 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:52.432702296 +0000 UTC m=+98.311111889 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found
Apr 23 16:36:20.435147 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.435129 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 16:36:20.443441 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:20.443421 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:36:20.443487 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:20.443463 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:24.443450886 +0000 UTC m=+130.321860469 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : secret "metrics-daemon-secret" not found
Apr 23 16:36:20.533834 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.533797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:36:20.536695 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.536678 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:36:20.547717 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.547697 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:36:20.558544 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.558523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2bbv\" (UniqueName: \"kubernetes.io/projected/1e04a51c-81ee-4ef6-b546-0b1e611642d1-kube-api-access-q2bbv\") pod \"network-check-target-nvscm\" (UID: \"1e04a51c-81ee-4ef6-b546-0b1e611642d1\") " pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:36:20.661523 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.661496 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tpj7t\""
Apr 23 16:36:20.668875 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.668855 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:36:20.783043 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:20.783014 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nvscm"]
Apr 23 16:36:20.787513 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:36:20.787487 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e04a51c_81ee_4ef6_b546_0b1e611642d1.slice/crio-231fd2ce7dfa59f6fe18352e66b4e38fe8cc69c41867d454c8c8f759a6ba1bcf WatchSource:0}: Error finding container 231fd2ce7dfa59f6fe18352e66b4e38fe8cc69c41867d454c8c8f759a6ba1bcf: Status 404 returned error can't find the container with id 231fd2ce7dfa59f6fe18352e66b4e38fe8cc69c41867d454c8c8f759a6ba1bcf
Apr 23 16:36:21.024190 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:21.024108 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nvscm" event={"ID":"1e04a51c-81ee-4ef6-b546-0b1e611642d1","Type":"ContainerStarted","Data":"231fd2ce7dfa59f6fe18352e66b4e38fe8cc69c41867d454c8c8f759a6ba1bcf"}
Apr 23 16:36:21.642159 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:21.642118 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:36:21.645091 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:21.645069 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 16:36:21.655441 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:21.655418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/810bb628-4255-436d-8a3c-1b1c2d54dc0e-original-pull-secret\") pod \"global-pull-secret-syncer-8thv9\" (UID: \"810bb628-4255-436d-8a3c-1b1c2d54dc0e\") " pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:36:21.854406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:21.854365 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8thv9"
Apr 23 16:36:21.984318 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:21.984288 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8thv9"]
Apr 23 16:36:21.987716 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:36:21.987686 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810bb628_4255_436d_8a3c_1b1c2d54dc0e.slice/crio-6565fb7b96a5f0c10fd205fe271e3fcc118d5310ea55c8a58c4cdecdf1352ce7 WatchSource:0}: Error finding container 6565fb7b96a5f0c10fd205fe271e3fcc118d5310ea55c8a58c4cdecdf1352ce7: Status 404 returned error can't find the container with id 6565fb7b96a5f0c10fd205fe271e3fcc118d5310ea55c8a58c4cdecdf1352ce7
Apr 23 16:36:22.027092 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:22.027062 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8thv9" event={"ID":"810bb628-4255-436d-8a3c-1b1c2d54dc0e","Type":"ContainerStarted","Data":"6565fb7b96a5f0c10fd205fe271e3fcc118d5310ea55c8a58c4cdecdf1352ce7"}
Apr 23 16:36:25.037168 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:25.037126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nvscm" event={"ID":"1e04a51c-81ee-4ef6-b546-0b1e611642d1","Type":"ContainerStarted","Data":"94d35f58576dfb2b02d1c511eae5780557128fd92cd2118c73cb79c5deda3729"}
Apr 23 16:36:25.037569 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:25.037319 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:36:25.055142 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:25.055093 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nvscm" podStartSLOduration=67.825544277 podStartE2EDuration="1m11.055080342s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:36:20.788922781 +0000 UTC m=+66.667332364" lastFinishedPulling="2026-04-23 16:36:24.018458838 +0000 UTC m=+69.896868429" observedRunningTime="2026-04-23 16:36:25.053698256 +0000 UTC m=+70.932107875" watchObservedRunningTime="2026-04-23 16:36:25.055080342 +0000 UTC m=+70.933489946"
Apr 23 16:36:27.043432 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:27.043395 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8thv9" event={"ID":"810bb628-4255-436d-8a3c-1b1c2d54dc0e","Type":"ContainerStarted","Data":"9ea9810ccc9d7013b2dbe76520fd5b2bbccaa37c9b1ae3aa7d273f2a7f01a851"}
Apr 23 16:36:27.061371 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:27.061324 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8thv9" podStartSLOduration=65.956989291 podStartE2EDuration="1m10.061309817s" podCreationTimestamp="2026-04-23 16:35:17 +0000 UTC" firstStartedPulling="2026-04-23 16:36:21.989946618 +0000 UTC m=+67.868356201" lastFinishedPulling="2026-04-23 16:36:26.094267145 +0000 UTC m=+71.972676727" observedRunningTime="2026-04-23 16:36:27.060701755 +0000 UTC m=+72.939111399" watchObservedRunningTime="2026-04-23 16:36:27.061309817 +0000 UTC m=+72.939719421"
Apr 23 16:36:52.459105 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:52.459051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:36:52.459105 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:52.459112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:36:52.459654 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:52.459201 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:52.459654 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:52.459210 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:52.459654 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:52.459267 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert podName:e8ef03b2-bf64-4538-9ba2-cd786001399c nodeName:}" failed. No retries permitted until 2026-04-23 16:37:56.459254388 +0000 UTC m=+162.337663971 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert") pod "ingress-canary-bcpj7" (UID: "e8ef03b2-bf64-4538-9ba2-cd786001399c") : secret "canary-serving-cert" not found
Apr 23 16:36:52.459654 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:36:52.459281 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls podName:90a007dd-2ad4-4fff-96a4-2c15593ab169 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:56.45927384 +0000 UTC m=+162.337683423 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls") pod "dns-default-79c4v" (UID: "90a007dd-2ad4-4fff-96a4-2c15593ab169") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:56.042471 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:36:56.042434 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nvscm"
Apr 23 16:37:24.478612 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:24.478556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:37:24.479059 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:37:24.478699 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:37:24.479059 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:37:24.478784 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs podName:dbfa21d6-bf64-400e-9cbe-b26171f7ef99 nodeName:}" failed. No retries permitted until 2026-04-23 16:39:26.47876828 +0000 UTC m=+252.357177864 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs") pod "network-metrics-daemon-7kt9x" (UID: "dbfa21d6-bf64-400e-9cbe-b26171f7ef99") : secret "metrics-daemon-secret" not found
Apr 23 16:37:30.789379 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:30.789350 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rpmks_ee000696-33d6-4a0e-a2ac-f710ef5e1515/dns-node-resolver/0.log"
Apr 23 16:37:31.790414 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:31.790387 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zrzxd_cea1706c-bca2-4388-9143-d1aa2fbcdc62/node-ca/0.log"
Apr 23 16:37:51.512164 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:37:51.512116 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-79c4v" podUID="90a007dd-2ad4-4fff-96a4-2c15593ab169"
Apr 23 16:37:51.537810 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:37:51.537783 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bcpj7" podUID="e8ef03b2-bf64-4538-9ba2-cd786001399c"
Apr 23 16:37:52.240534 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:52.240500 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-79c4v"
Apr 23 16:37:52.749071 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:37:52.749035 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7kt9x" podUID="dbfa21d6-bf64-400e-9cbe-b26171f7ef99"
Apr 23 16:37:55.248042 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:55.248005 2571 generic.go:358] "Generic (PLEG): container finished" podID="bef6393a-edcc-44f9-ade1-a5a09b18288b" containerID="8f3c0e8264f72dec4c7302e4ef2dd5489850ce4f7dec9dea12a35696f9f4d641" exitCode=255
Apr 23 16:37:55.248393 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:55.248049 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" event={"ID":"bef6393a-edcc-44f9-ade1-a5a09b18288b","Type":"ContainerDied","Data":"8f3c0e8264f72dec4c7302e4ef2dd5489850ce4f7dec9dea12a35696f9f4d641"}
Apr 23 16:37:55.248393 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:55.248354 2571 scope.go:117] "RemoveContainer" containerID="8f3c0e8264f72dec4c7302e4ef2dd5489850ce4f7dec9dea12a35696f9f4d641"
Apr 23 16:37:56.251792 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.251759 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fbdc46d5c-jgjjc" event={"ID":"bef6393a-edcc-44f9-ade1-a5a09b18288b","Type":"ContainerStarted","Data":"668961a81bbb96190f9cfa4c25642eb23b0bce51c7e4524297ff690236a7acd8"}
Apr 23 16:37:56.500242 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.500204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:37:56.500411 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.500256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:37:56.502768 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.502709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90a007dd-2ad4-4fff-96a4-2c15593ab169-metrics-tls\") pod \"dns-default-79c4v\" (UID: \"90a007dd-2ad4-4fff-96a4-2c15593ab169\") " pod="openshift-dns/dns-default-79c4v"
Apr 23 16:37:56.502768 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.502720 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8ef03b2-bf64-4538-9ba2-cd786001399c-cert\") pod \"ingress-canary-bcpj7\" (UID: \"e8ef03b2-bf64-4538-9ba2-cd786001399c\") " pod="openshift-ingress-canary/ingress-canary-bcpj7"
Apr 23 16:37:56.743551 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.743524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vvvrx\""
Apr 23 16:37:56.751661 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.751645 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-79c4v"
Apr 23 16:37:56.865911 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:56.865879 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-79c4v"]
Apr 23 16:37:56.870052 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:37:56.870024 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a007dd_2ad4_4fff_96a4_2c15593ab169.slice/crio-d3147628cf590312eee5997d3d4230ddc1ea65897df92e5169bd9246f8d2c305 WatchSource:0}: Error finding container d3147628cf590312eee5997d3d4230ddc1ea65897df92e5169bd9246f8d2c305: Status 404 returned error can't find the container with id d3147628cf590312eee5997d3d4230ddc1ea65897df92e5169bd9246f8d2c305
Apr 23 16:37:57.254976 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:57.254942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-79c4v" event={"ID":"90a007dd-2ad4-4fff-96a4-2c15593ab169","Type":"ContainerStarted","Data":"d3147628cf590312eee5997d3d4230ddc1ea65897df92e5169bd9246f8d2c305"}
Apr 23 16:37:58.261945 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:58.261907 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-79c4v" event={"ID":"90a007dd-2ad4-4fff-96a4-2c15593ab169","Type":"ContainerStarted","Data":"b7323f3f483588412b777ced6ef8ccf5c7d1579f4a74f79d317eded5c3ff927b"}
Apr 23 16:37:59.268770 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:59.268731 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-79c4v" event={"ID":"90a007dd-2ad4-4fff-96a4-2c15593ab169","Type":"ContainerStarted","Data":"f3ded2dd6ff42128a9896f0688ef80ef670d312712d6b529c864e9c8e60c7315"}
Apr 23 16:37:59.269134 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:59.268892 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-79c4v"
Apr 23 16:37:59.288495 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:37:59.288448 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-79c4v" podStartSLOduration=130.058533427 podStartE2EDuration="2m11.288434179s" podCreationTimestamp="2026-04-23 16:35:48 +0000 UTC" firstStartedPulling="2026-04-23 16:37:56.872253585 +0000 UTC m=+162.750663169" lastFinishedPulling="2026-04-23 16:37:58.102154338 +0000 UTC m=+163.980563921" observedRunningTime="2026-04-23 16:37:59.288184172 +0000 UTC m=+165.166593777" watchObservedRunningTime="2026-04-23 16:37:59.288434179 +0000 UTC m=+165.166843783"
Apr 23 16:38:00.789356 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.789327 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vrfds"]
Apr 23 16:38:00.792288 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.792271 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vrfds"
Apr 23 16:38:00.797081 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.797049 2571 status_manager.go:895] "Failed to get status for pod" podUID="95f8311a-523d-4e15-9d53-62b348ee103c" pod="openshift-insights/insights-runtime-extractor-vrfds" err="pods \"insights-runtime-extractor-vrfds\" is forbidden: User \"system:node:ip-10-0-136-190.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-136-190.ec2.internal' and this object"
Apr 23 16:38:00.797081 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:00.797052 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-sa-dockercfg-kd74h\" is forbidden: User \"system:node:ip-10-0-136-190.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-136-190.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kd74h\"" type="*v1.Secret"
Apr 23 16:38:00.797426 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:00.797395 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:ip-10-0-136-190.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-136-190.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" type="*v1.ConfigMap"
Apr 23 16:38:00.797426 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:00.797410 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-136-190.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-136-190.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap"
Apr 23 16:38:00.797563 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:00.797397 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-136-190.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-136-190.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap"
Apr 23 16:38:00.797563 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:00.797415 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-tls\" is forbidden: User \"system:node:ip-10-0-136-190.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-136-190.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" type="*v1.Secret"
Apr 23 16:38:00.815302 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.815272 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7b9786f944-5tkx5"]
Apr 23 16:38:00.818089 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.818073 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b9786f944-5tkx5"
Apr 23 16:38:00.818759 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.818737 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vrfds"]
Apr 23 16:38:00.821764 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.821737 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 16:38:00.822113 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.822099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 16:38:00.822836 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.822820 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 16:38:00.823789 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.823769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95f8311a-523d-4e15-9d53-62b348ee103c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds"
Apr 23 16:38:00.823927 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.823907 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-x7fs6\""
Apr 23 16:38:00.824043 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.823939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95f8311a-523d-4e15-9d53-62b348ee103c-crio-socket\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds"
Apr 23 16:38:00.824043 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.824000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95f8311a-523d-4e15-9d53-62b348ee103c-data-volume\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds"
Apr 23 16:38:00.824043 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.824024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsdc8\" (UniqueName: \"kubernetes.io/projected/95f8311a-523d-4e15-9d53-62b348ee103c-kube-api-access-rsdc8\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds"
Apr 23 16:38:00.824198 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.824066 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95f8311a-523d-4e15-9d53-62b348ee103c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds"
Apr 23 16:38:00.827172 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.827149 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 16:38:00.845669 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.845645 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b9786f944-5tkx5"]
Apr 23 16:38:00.925100 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-registry-certificates\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5"
Apr 23 16:38:00.925254 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925114 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-trusted-ca\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5"
Apr 23 16:38:00.925254 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925179 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-installation-pull-secrets\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5"
Apr 23 16:38:00.925254 ip-10-0-136-190 kubenswrapper[2571]: I0423
16:38:00.925216 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95f8311a-523d-4e15-9d53-62b348ee103c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:00.925254 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-image-registry-private-configuration\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95f8311a-523d-4e15-9d53-62b348ee103c-crio-socket\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925299 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95f8311a-523d-4e15-9d53-62b348ee103c-data-volume\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsdc8\" (UniqueName: \"kubernetes.io/projected/95f8311a-523d-4e15-9d53-62b348ee103c-kube-api-access-rsdc8\") pod 
\"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-registry-tls\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-ca-trust-extracted\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925408 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-bound-sa-token\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925424 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95f8311a-523d-4e15-9d53-62b348ee103c-crio-socket\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:00.925457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925446 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrf68\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-kube-api-access-wrf68\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:00.925775 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95f8311a-523d-4e15-9d53-62b348ee103c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:00.925775 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:00.925742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95f8311a-523d-4e15-9d53-62b348ee103c-data-volume\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:01.026213 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026182 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-registry-tls\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-ca-trust-extracted\") pod \"image-registry-7b9786f944-5tkx5\" (UID: 
\"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026240 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-bound-sa-token\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026294 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrf68\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-kube-api-access-wrf68\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-registry-certificates\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-trusted-ca\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-installation-pull-secrets\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026406 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-image-registry-private-configuration\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.026833 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.026706 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-ca-trust-extracted\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.027268 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.027244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-registry-certificates\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.027710 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.027683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-trusted-ca\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " 
pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.028770 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.028741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-image-registry-private-configuration\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.028848 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.028751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-registry-tls\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.028888 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.028863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-installation-pull-secrets\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.035000 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.034979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-bound-sa-token\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.035180 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.035161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrf68\" (UniqueName: 
\"kubernetes.io/projected/bab205f1-bda8-4c2a-ba64-83dc4512e8a6-kube-api-access-wrf68\") pod \"image-registry-7b9786f944-5tkx5\" (UID: \"bab205f1-bda8-4c2a-ba64-83dc4512e8a6\") " pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.127860 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.127770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:01.250071 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.250041 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b9786f944-5tkx5"] Apr 23 16:38:01.253455 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:38:01.253426 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbab205f1_bda8_4c2a_ba64_83dc4512e8a6.slice/crio-94fb54f27f7612e86da9c2b14f134cc9931266c3244e479cd184deb7b4ef2255 WatchSource:0}: Error finding container 94fb54f27f7612e86da9c2b14f134cc9931266c3244e479cd184deb7b4ef2255: Status 404 returned error can't find the container with id 94fb54f27f7612e86da9c2b14f134cc9931266c3244e479cd184deb7b4ef2255 Apr 23 16:38:01.274467 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.274439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" event={"ID":"bab205f1-bda8-4c2a-ba64-83dc4512e8a6","Type":"ContainerStarted","Data":"94fb54f27f7612e86da9c2b14f134cc9931266c3244e479cd184deb7b4ef2255"} Apr 23 16:38:01.728105 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:01.728071 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 16:38:01.926015 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:01.925981 2571 configmap.go:193] Couldn't get configMap openshift-insights/kube-rbac-proxy: failed to sync configmap cache: timed out 
waiting for the condition Apr 23 16:38:01.926366 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:01.926061 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/95f8311a-523d-4e15-9d53-62b348ee103c-kube-rbac-proxy-cm podName:95f8311a-523d-4e15-9d53-62b348ee103c nodeName:}" failed. No retries permitted until 2026-04-23 16:38:02.426045862 +0000 UTC m=+168.304455445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-rbac-proxy-cm" (UniqueName: "kubernetes.io/configmap/95f8311a-523d-4e15-9d53-62b348ee103c-kube-rbac-proxy-cm") pod "insights-runtime-extractor-vrfds" (UID: "95f8311a-523d-4e15-9d53-62b348ee103c") : failed to sync configmap cache: timed out waiting for the condition Apr 23 16:38:01.926366 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:01.925985 2571 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: failed to sync secret cache: timed out waiting for the condition Apr 23 16:38:01.926366 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:01.926127 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95f8311a-523d-4e15-9d53-62b348ee103c-insights-runtime-extractor-tls podName:95f8311a-523d-4e15-9d53-62b348ee103c nodeName:}" failed. No retries permitted until 2026-04-23 16:38:02.426114834 +0000 UTC m=+168.304524421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/95f8311a-523d-4e15-9d53-62b348ee103c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vrfds" (UID: "95f8311a-523d-4e15-9d53-62b348ee103c") : failed to sync secret cache: timed out waiting for the condition Apr 23 16:38:01.941497 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:01.941479 2571 projected.go:289] Couldn't get configMap openshift-insights/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 23 16:38:01.941554 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:01.941507 2571 projected.go:194] Error preparing data for projected volume kube-api-access-rsdc8 for pod openshift-insights/insights-runtime-extractor-vrfds: failed to sync configmap cache: timed out waiting for the condition Apr 23 16:38:01.941554 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:38:01.941546 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95f8311a-523d-4e15-9d53-62b348ee103c-kube-api-access-rsdc8 podName:95f8311a-523d-4e15-9d53-62b348ee103c nodeName:}" failed. No retries permitted until 2026-04-23 16:38:02.441536065 +0000 UTC m=+168.319945648 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rsdc8" (UniqueName: "kubernetes.io/projected/95f8311a-523d-4e15-9d53-62b348ee103c-kube-api-access-rsdc8") pod "insights-runtime-extractor-vrfds" (UID: "95f8311a-523d-4e15-9d53-62b348ee103c") : failed to sync configmap cache: timed out waiting for the condition Apr 23 16:38:02.035254 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.035185 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 16:38:02.160046 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.160021 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kd74h\"" Apr 23 16:38:02.183991 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.183966 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:38:02.278115 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.278079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" event={"ID":"bab205f1-bda8-4c2a-ba64-83dc4512e8a6","Type":"ContainerStarted","Data":"99119c20f2ccf0502518d596ae7a9512770eaa0a2d00eab5259db5644b495177"} Apr 23 16:38:02.278279 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.278194 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:02.299949 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.299839 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" podStartSLOduration=2.29982185 podStartE2EDuration="2.29982185s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-23 16:38:02.299181418 +0000 UTC m=+168.177591024" watchObservedRunningTime="2026-04-23 16:38:02.29982185 +0000 UTC m=+168.178231457" Apr 23 16:38:02.346755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.346729 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:38:02.437459 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.437428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95f8311a-523d-4e15-9d53-62b348ee103c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:02.437637 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.437516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95f8311a-523d-4e15-9d53-62b348ee103c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:02.437967 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.437946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95f8311a-523d-4e15-9d53-62b348ee103c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:02.439739 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.439713 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95f8311a-523d-4e15-9d53-62b348ee103c-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:02.538480 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.538435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsdc8\" (UniqueName: \"kubernetes.io/projected/95f8311a-523d-4e15-9d53-62b348ee103c-kube-api-access-rsdc8\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:02.540877 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.540850 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsdc8\" (UniqueName: \"kubernetes.io/projected/95f8311a-523d-4e15-9d53-62b348ee103c-kube-api-access-rsdc8\") pod \"insights-runtime-extractor-vrfds\" (UID: \"95f8311a-523d-4e15-9d53-62b348ee103c\") " pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:02.600838 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.600780 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vrfds" Apr 23 16:38:02.721643 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.721612 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vrfds"] Apr 23 16:38:02.725976 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:38:02.725946 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f8311a_523d_4e15_9d53_62b348ee103c.slice/crio-67b0daaa9fbdca0071fdd70969bd596393d4cc40764b43390119cd155bd48da6 WatchSource:0}: Error finding container 67b0daaa9fbdca0071fdd70969bd596393d4cc40764b43390119cd155bd48da6: Status 404 returned error can't find the container with id 67b0daaa9fbdca0071fdd70969bd596393d4cc40764b43390119cd155bd48da6 Apr 23 16:38:02.739473 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.739451 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bcpj7" Apr 23 16:38:02.741978 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.741956 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9v47\"" Apr 23 16:38:02.750382 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.750354 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bcpj7" Apr 23 16:38:02.870318 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:02.870236 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bcpj7"] Apr 23 16:38:02.873997 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:38:02.873963 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ef03b2_bf64_4538_9ba2_cd786001399c.slice/crio-20d1904f31ee2a693a2930aaa02e31be58c12c73c3116349e84915b2ed8c0bfa WatchSource:0}: Error finding container 20d1904f31ee2a693a2930aaa02e31be58c12c73c3116349e84915b2ed8c0bfa: Status 404 returned error can't find the container with id 20d1904f31ee2a693a2930aaa02e31be58c12c73c3116349e84915b2ed8c0bfa Apr 23 16:38:03.281912 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:03.281887 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bcpj7" event={"ID":"e8ef03b2-bf64-4538-9ba2-cd786001399c","Type":"ContainerStarted","Data":"20d1904f31ee2a693a2930aaa02e31be58c12c73c3116349e84915b2ed8c0bfa"} Apr 23 16:38:03.283374 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:03.283349 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vrfds" event={"ID":"95f8311a-523d-4e15-9d53-62b348ee103c","Type":"ContainerStarted","Data":"f9b4eacdc729f87ad22e003bc37b07c56713aa44cc0618abe1f0620234849627"} Apr 23 16:38:03.283475 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:03.283381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vrfds" event={"ID":"95f8311a-523d-4e15-9d53-62b348ee103c","Type":"ContainerStarted","Data":"67b0daaa9fbdca0071fdd70969bd596393d4cc40764b43390119cd155bd48da6"} Apr 23 16:38:04.287570 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:04.287530 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-vrfds" event={"ID":"95f8311a-523d-4e15-9d53-62b348ee103c","Type":"ContainerStarted","Data":"8d522ccf46184dcaa5120b6456794c81a04b98bbf94074aa332178eba8c14398"}
Apr 23 16:38:05.293490 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:05.293447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bcpj7" event={"ID":"e8ef03b2-bf64-4538-9ba2-cd786001399c","Type":"ContainerStarted","Data":"964ff2a1959f4ad2a980dacaac43df95fca80f65657192eedbeec097c31bc480"}
Apr 23 16:38:05.295113 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:05.295090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vrfds" event={"ID":"95f8311a-523d-4e15-9d53-62b348ee103c","Type":"ContainerStarted","Data":"0fa0583fed164c915181f77430e8830722460ffd1350ad928070c3e95378075e"}
Apr 23 16:38:05.311209 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:05.311166 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bcpj7" podStartSLOduration=135.308331916 podStartE2EDuration="2m17.311153541s" podCreationTimestamp="2026-04-23 16:35:48 +0000 UTC" firstStartedPulling="2026-04-23 16:38:02.875812063 +0000 UTC m=+168.754221646" lastFinishedPulling="2026-04-23 16:38:04.878633688 +0000 UTC m=+170.757043271" observedRunningTime="2026-04-23 16:38:05.310869979 +0000 UTC m=+171.189279577" watchObservedRunningTime="2026-04-23 16:38:05.311153541 +0000 UTC m=+171.189563145"
Apr 23 16:38:05.332457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:05.332411 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vrfds" podStartSLOduration=3.233630041 podStartE2EDuration="5.332397839s" podCreationTimestamp="2026-04-23 16:38:00 +0000 UTC" firstStartedPulling="2026-04-23 16:38:02.781201157 +0000 UTC m=+168.659610743" lastFinishedPulling="2026-04-23 16:38:04.87996895 +0000 UTC m=+170.758378541" observedRunningTime="2026-04-23 16:38:05.332142524 +0000 UTC m=+171.210552131" watchObservedRunningTime="2026-04-23 16:38:05.332397839 +0000 UTC m=+171.210807441"
Apr 23 16:38:07.740184 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:07.740137 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x"
Apr 23 16:38:09.273751 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:09.273720 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-79c4v"
Apr 23 16:38:10.407653 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.407615 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-r52q6"]
Apr 23 16:38:10.412328 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.412310 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.414902 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.414777 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 16:38:10.415036 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.414927 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-292nf\""
Apr 23 16:38:10.415036 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.415028 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 16:38:10.415232 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.415218 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 16:38:10.416060 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.416043 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 16:38:10.416152 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.416047 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 16:38:10.416152 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.416087 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 16:38:10.493982 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.493953 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-root\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494114 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.493989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-accelerators-collector-config\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494114 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.494056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-textfile\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494114 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.494082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-tls\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494215 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.494138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-wtmp\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494215 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.494159 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-sys\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494215 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.494173 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a765611-ea7a-4204-b9f3-15d7d8a32024-metrics-client-ca\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494215 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.494195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.494343 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.494258 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xd9q\" (UniqueName: \"kubernetes.io/projected/3a765611-ea7a-4204-b9f3-15d7d8a32024-kube-api-access-4xd9q\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-tls\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595143 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-wtmp\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595168 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-sys\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a765611-ea7a-4204-b9f3-15d7d8a32024-metrics-client-ca\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595247 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xd9q\" (UniqueName: \"kubernetes.io/projected/3a765611-ea7a-4204-b9f3-15d7d8a32024-kube-api-access-4xd9q\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595279 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-root\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-accelerators-collector-config\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595345 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-root\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595363 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-textfile\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595725 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-sys\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595725 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-wtmp\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595725 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-textfile\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595874 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a765611-ea7a-4204-b9f3-15d7d8a32024-metrics-client-ca\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.595928 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.595900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-accelerators-collector-config\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.597489 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.597462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-tls\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.597582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.597512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a765611-ea7a-4204-b9f3-15d7d8a32024-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.605242 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.605219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xd9q\" (UniqueName: \"kubernetes.io/projected/3a765611-ea7a-4204-b9f3-15d7d8a32024-kube-api-access-4xd9q\") pod \"node-exporter-r52q6\" (UID: \"3a765611-ea7a-4204-b9f3-15d7d8a32024\") " pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.721865 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:10.721835 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-r52q6"
Apr 23 16:38:10.729454 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:38:10.729427 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a765611_ea7a_4204_b9f3_15d7d8a32024.slice/crio-ef315010cbdb79539b39978e58d319e25ad663b38022923f3dc4826e0ced06db WatchSource:0}: Error finding container ef315010cbdb79539b39978e58d319e25ad663b38022923f3dc4826e0ced06db: Status 404 returned error can't find the container with id ef315010cbdb79539b39978e58d319e25ad663b38022923f3dc4826e0ced06db
Apr 23 16:38:11.318910 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:11.318866 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r52q6" event={"ID":"3a765611-ea7a-4204-b9f3-15d7d8a32024","Type":"ContainerStarted","Data":"ef315010cbdb79539b39978e58d319e25ad663b38022923f3dc4826e0ced06db"}
Apr 23 16:38:12.322817 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:12.322781 2571 generic.go:358] "Generic (PLEG): container finished" podID="3a765611-ea7a-4204-b9f3-15d7d8a32024" containerID="515247c8912fd8989c0e9661368e0757dbc1bb02901cacd96d61dc9d680cf6a9" exitCode=0
Apr 23 16:38:12.323233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:12.322827 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r52q6" event={"ID":"3a765611-ea7a-4204-b9f3-15d7d8a32024","Type":"ContainerDied","Data":"515247c8912fd8989c0e9661368e0757dbc1bb02901cacd96d61dc9d680cf6a9"}
Apr 23 16:38:13.326853 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:13.326821 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r52q6" event={"ID":"3a765611-ea7a-4204-b9f3-15d7d8a32024","Type":"ContainerStarted","Data":"701e136073907ac722616b98e8cf3988565090a3dfc986092ebbbb4ecebd1fd9"}
Apr 23 16:38:13.327198 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:13.326860 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r52q6" event={"ID":"3a765611-ea7a-4204-b9f3-15d7d8a32024","Type":"ContainerStarted","Data":"60c29bd5cdb1a31b66e73dbc308fe9255c96949b617cc2260bb13c827aa44253"}
Apr 23 16:38:13.352386 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:13.352333 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-r52q6" podStartSLOduration=2.617815297 podStartE2EDuration="3.352319805s" podCreationTimestamp="2026-04-23 16:38:10 +0000 UTC" firstStartedPulling="2026-04-23 16:38:10.731159468 +0000 UTC m=+176.609569065" lastFinishedPulling="2026-04-23 16:38:11.465663989 +0000 UTC m=+177.344073573" observedRunningTime="2026-04-23 16:38:13.350465287 +0000 UTC m=+179.228874894" watchObservedRunningTime="2026-04-23 16:38:13.352319805 +0000 UTC m=+179.230729410"
Apr 23 16:38:16.646681 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.646651 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 16:38:16.651151 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.651133 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.653585 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.653562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 16:38:16.654140 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.654117 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 16:38:16.654340 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.654313 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lnpsq\""
Apr 23 16:38:16.654445 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.654384 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 16:38:16.655031 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.655018 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 16:38:16.656161 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.656130 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 16:38:16.656499 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.656479 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 16:38:16.656499 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.656479 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 16:38:16.656587 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.656504 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 16:38:16.657735 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.657718 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 16:38:16.657832 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.657756 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 16:38:16.658123 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.658104 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 16:38:16.659011 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.658994 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bkalbold486o3\""
Apr 23 16:38:16.660656 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.660638 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 16:38:16.662052 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.662037 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 16:38:16.666073 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.666056 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 16:38:16.741956 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.741932 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742083 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.741965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742083 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.741987 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-web-config\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742083 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742083 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742063 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742083 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742285 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742285 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742150 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742285 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742167 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config-out\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742285 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742285 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742442 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742442 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742328 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742442 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742442 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742409 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742592 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742592 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.742592 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.742503 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shx9f\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-kube-api-access-shx9f\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843408 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843408 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843409 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843591 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843591 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843591 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843763 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config-out\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843763 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843763 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843697 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843909 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843909 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843909 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843909 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.843909 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.844164 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.844164 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shx9f\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-kube-api-access-shx9f\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.844164 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.843996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.844317 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.844212 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.844529 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.844021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.844748 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.844730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-web-config\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.845189 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.845157 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.846395 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.846369 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.846933 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.846861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.847156 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.844520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.847501 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.847473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.847728 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.847707 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config-out\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:38:16.848119 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.848096 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-web-config\") pod
\"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.848265 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.848246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.848509 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.848481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.848610 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.848573 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.848903 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.848879 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.848979 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.848932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.849072 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.849050 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.849288 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.849261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.849528 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.849512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.850301 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.850282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.860236 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.860218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shx9f\" 
(UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-kube-api-access-shx9f\") pod \"prometheus-k8s-0\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:16.959944 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:16.959905 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:17.148953 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:17.148920 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:38:17.151725 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:38:17.151695 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb91bab7f_5f15_4bf1_a6c3_241a09794ddc.slice/crio-95e3feb27404383424a337b49fad967d21315d0687f78b06925d1dc2035e67fc WatchSource:0}: Error finding container 95e3feb27404383424a337b49fad967d21315d0687f78b06925d1dc2035e67fc: Status 404 returned error can't find the container with id 95e3feb27404383424a337b49fad967d21315d0687f78b06925d1dc2035e67fc Apr 23 16:38:17.338079 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:17.337990 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerStarted","Data":"95e3feb27404383424a337b49fad967d21315d0687f78b06925d1dc2035e67fc"} Apr 23 16:38:18.341408 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:18.341375 2571 generic.go:358] "Generic (PLEG): container finished" podID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2" exitCode=0 Apr 23 16:38:18.341797 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:18.341441 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"} Apr 23 16:38:21.352053 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:21.352021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerStarted","Data":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} Apr 23 16:38:21.352053 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:21.352056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerStarted","Data":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} Apr 23 16:38:23.287695 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:23.287665 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7b9786f944-5tkx5" Apr 23 16:38:23.360404 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:23.360371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerStarted","Data":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} Apr 23 16:38:23.360404 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:23.360405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerStarted","Data":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} Apr 23 16:38:23.360593 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:23.360419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerStarted","Data":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} Apr 23 16:38:23.360593 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:23.360427 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerStarted","Data":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} Apr 23 16:38:23.405805 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:23.405762 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.951227832 podStartE2EDuration="7.405747587s" podCreationTimestamp="2026-04-23 16:38:16 +0000 UTC" firstStartedPulling="2026-04-23 16:38:17.153727575 +0000 UTC m=+183.032137158" lastFinishedPulling="2026-04-23 16:38:22.608247315 +0000 UTC m=+188.486656913" observedRunningTime="2026-04-23 16:38:23.403840159 +0000 UTC m=+189.282249821" watchObservedRunningTime="2026-04-23 16:38:23.405747587 +0000 UTC m=+189.284157191" Apr 23 16:38:26.960361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:26.960324 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:30.894917 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:30.894856 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" podUID="a7d872aa-f6d8-4da7-929f-adb4601407db" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:38:40.894859 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:40.894820 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" podUID="a7d872aa-f6d8-4da7-929f-adb4601407db" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:38:50.894338 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:50.894297 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" podUID="a7d872aa-f6d8-4da7-929f-adb4601407db" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 16:38:50.894709 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:50.894359 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" Apr 23 16:38:50.894878 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:50.894848 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"10bd77ac328bcf3dddb0386ed33b4b49b88b2a7d11af38444704c4c247e1e76e"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 16:38:50.894921 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:50.894911 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" podUID="a7d872aa-f6d8-4da7-929f-adb4601407db" containerName="service-proxy" containerID="cri-o://10bd77ac328bcf3dddb0386ed33b4b49b88b2a7d11af38444704c4c247e1e76e" gracePeriod=30 Apr 23 16:38:51.431561 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:51.431528 2571 generic.go:358] "Generic (PLEG): container finished" podID="a7d872aa-f6d8-4da7-929f-adb4601407db" containerID="10bd77ac328bcf3dddb0386ed33b4b49b88b2a7d11af38444704c4c247e1e76e" exitCode=2 Apr 23 16:38:51.431742 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:51.431594 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" event={"ID":"a7d872aa-f6d8-4da7-929f-adb4601407db","Type":"ContainerDied","Data":"10bd77ac328bcf3dddb0386ed33b4b49b88b2a7d11af38444704c4c247e1e76e"} Apr 23 16:38:51.431742 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:38:51.431661 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bcb5f9c9b-qwmzn" event={"ID":"a7d872aa-f6d8-4da7-929f-adb4601407db","Type":"ContainerStarted","Data":"aee4c03ce9cd8c4020de5d18ab3192886e842e680b871058ba3d3be7684a0a7a"} Apr 23 16:39:16.960163 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:16.960121 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:16.978266 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:16.978236 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:17.514978 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:17.514950 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:26.499776 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:26.499735 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:39:26.502016 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:26.501989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbfa21d6-bf64-400e-9cbe-b26171f7ef99-metrics-certs\") pod \"network-metrics-daemon-7kt9x\" (UID: \"dbfa21d6-bf64-400e-9cbe-b26171f7ef99\") " 
pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:39:26.643515 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:26.643483 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nldpm\"" Apr 23 16:39:26.651570 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:26.651553 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kt9x" Apr 23 16:39:26.766412 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:26.766341 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7kt9x"] Apr 23 16:39:26.770672 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:39:26.770648 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbfa21d6_bf64_400e_9cbe_b26171f7ef99.slice/crio-ab4064cfa5ef33b2362ca0a190726778496bd1d1c48630e9fedb491e1d986697 WatchSource:0}: Error finding container ab4064cfa5ef33b2362ca0a190726778496bd1d1c48630e9fedb491e1d986697: Status 404 returned error can't find the container with id ab4064cfa5ef33b2362ca0a190726778496bd1d1c48630e9fedb491e1d986697 Apr 23 16:39:27.524343 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:27.524310 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7kt9x" event={"ID":"dbfa21d6-bf64-400e-9cbe-b26171f7ef99","Type":"ContainerStarted","Data":"ab4064cfa5ef33b2362ca0a190726778496bd1d1c48630e9fedb491e1d986697"} Apr 23 16:39:28.528263 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:28.528223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7kt9x" event={"ID":"dbfa21d6-bf64-400e-9cbe-b26171f7ef99","Type":"ContainerStarted","Data":"382e0e4d1996a2ca73cbd2d75bfd505c658cfe4696cee435b858da20fad50f50"} Apr 23 16:39:28.528263 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:28.528256 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7kt9x" event={"ID":"dbfa21d6-bf64-400e-9cbe-b26171f7ef99","Type":"ContainerStarted","Data":"9051faff127c6b9515f7ee7a1d9c20e43788cbf1ca81c5b90fb20c774a3db7c9"} Apr 23 16:39:28.546201 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:28.546158 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7kt9x" podStartSLOduration=253.622691768 podStartE2EDuration="4m14.546143713s" podCreationTimestamp="2026-04-23 16:35:14 +0000 UTC" firstStartedPulling="2026-04-23 16:39:26.77276635 +0000 UTC m=+252.651175933" lastFinishedPulling="2026-04-23 16:39:27.696218295 +0000 UTC m=+253.574627878" observedRunningTime="2026-04-23 16:39:28.544773783 +0000 UTC m=+254.423183400" watchObservedRunningTime="2026-04-23 16:39:28.546143713 +0000 UTC m=+254.424553317" Apr 23 16:39:35.169567 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.169527 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:35.170207 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.170169 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy" containerID="cri-o://561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e" gracePeriod=600 Apr 23 16:39:35.170207 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.170169 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="config-reloader" containerID="cri-o://2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb" gracePeriod=600 Apr 23 16:39:35.170411 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.170182 2571 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="thanos-sidecar" containerID="cri-o://8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a" gracePeriod=600 Apr 23 16:39:35.170411 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.170211 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-thanos" containerID="cri-o://470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893" gracePeriod=600 Apr 23 16:39:35.170411 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.170193 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-web" containerID="cri-o://5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3" gracePeriod=600 Apr 23 16:39:35.170411 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.170279 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="prometheus" containerID="cri-o://a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6" gracePeriod=600 Apr 23 16:39:35.406280 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.406256 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:35.552152 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552116 2571 generic.go:358] "Generic (PLEG): container finished" podID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893" exitCode=0 Apr 23 16:39:35.552152 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552150 2571 generic.go:358] "Generic (PLEG): container finished" podID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e" exitCode=0 Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552159 2571 generic.go:358] "Generic (PLEG): container finished" podID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3" exitCode=0 Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552169 2571 generic.go:358] "Generic (PLEG): container finished" podID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a" exitCode=0 Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552176 2571 generic.go:358] "Generic (PLEG): container finished" podID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb" exitCode=0 Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552184 2571 generic.go:358] "Generic (PLEG): container finished" podID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6" exitCode=0 Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552237 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552247 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552256 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552256 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552274 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552282 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b91bab7f-5f15-4bf1-a6c3-241a09794ddc","Type":"ContainerDied","Data":"95e3feb27404383424a337b49fad967d21315d0687f78b06925d1dc2035e67fc"} Apr 23 16:39:35.552354 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.552315 2571 scope.go:117] "RemoveContainer" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893" Apr 23 16:39:35.559457 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.559437 2571 scope.go:117] "RemoveContainer" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e" Apr 23 16:39:35.566062 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.566043 2571 scope.go:117] "RemoveContainer" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3" Apr 23 16:39:35.570996 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.570974 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-kubelet-serving-ca-bundle\") pod 
\"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571089 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571014 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shx9f\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-kube-api-access-shx9f\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571089 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571055 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571201 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571091 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-metrics-client-ca\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571201 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571115 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-thanos-prometheus-http-client-file\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571201 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571142 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571435 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571415 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-web-config\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571507 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571454 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-db\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571507 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571490 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-serving-certs-ca-bundle\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571518 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-trusted-ca-bundle\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571524 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:39:35.571636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571540 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config-out\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571520 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:39:35.571636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571575 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-tls-assets\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571886 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571639 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-grpc-tls\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571886 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571663 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-kube-rbac-proxy\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571886 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571687 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-tls\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571886 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571722 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-rulefiles-0\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571886 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571751 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-metrics-client-certs\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.571886 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571774 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\" (UID: \"b91bab7f-5f15-4bf1-a6c3-241a09794ddc\") "
Apr 23 16:39:35.572193 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571976 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\""
Apr 23 16:39:35.572193 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.571994 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-metrics-client-ca\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\""
Apr 23 16:39:35.572193 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.572135 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:39:35.572447 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.572425 2571 scope.go:117] "RemoveContainer" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"
Apr 23 16:39:35.572578 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.572555 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:39:35.572918 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.572798 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:39:35.574400 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.574370 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.574529 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.574502 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config" (OuterVolumeSpecName: "config") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.574676 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.574504 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:39:35.575502 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.575473 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.575647 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.575618 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.576047 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.576007 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.576158 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.576141 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-kube-api-access-shx9f" (OuterVolumeSpecName: "kube-api-access-shx9f") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "kube-api-access-shx9f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:39:35.576273 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.576256 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.576350 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.576304 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:39:35.576397 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.576377 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config-out" (OuterVolumeSpecName: "config-out") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:39:35.576991 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.576965 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.577284 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.577265 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.580708 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.580685 2571 scope.go:117] "RemoveContainer" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"
Apr 23 16:39:35.585507 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.585481 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-web-config" (OuterVolumeSpecName: "web-config") pod "b91bab7f-5f15-4bf1-a6c3-241a09794ddc" (UID: "b91bab7f-5f15-4bf1-a6c3-241a09794ddc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:39:35.586716 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.586702 2571 scope.go:117] "RemoveContainer" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"
Apr 23 16:39:35.592717 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.592702 2571 scope.go:117] "RemoveContainer" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"
Apr 23 16:39:35.598315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.598301 2571 scope.go:117] "RemoveContainer" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"
Apr 23 16:39:35.599077 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:39:35.598943 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": container with ID starting with 470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893 not found: ID does not exist" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"
Apr 23 16:39:35.599077 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.598979 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} err="failed to get container status \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": rpc error: code = NotFound desc = could not find container \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": container with ID starting with 470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893 not found: ID does not exist"
Apr 23 16:39:35.599077 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.599002 2571 scope.go:117] "RemoveContainer" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"
Apr 23 16:39:35.599405 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:39:35.599286 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": container with ID starting with 561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e not found: ID does not exist" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"
Apr 23 16:39:35.599405 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.599318 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} err="failed to get container status \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": rpc error: code = NotFound desc = could not find container \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": container with ID starting with 561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e not found: ID does not exist"
Apr 23 16:39:35.599405 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.599339 2571 scope.go:117] "RemoveContainer" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"
Apr 23 16:39:35.599873 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:39:35.599739 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": container with ID starting with 5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3 not found: ID does not exist" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"
Apr 23 16:39:35.599873 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.599774 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} err="failed to get container status \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": rpc error: code = NotFound desc = could not find container \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": container with ID starting with 5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3 not found: ID does not exist"
Apr 23 16:39:35.599873 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.599795 2571 scope.go:117] "RemoveContainer" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"
Apr 23 16:39:35.600881 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:39:35.600851 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": container with ID starting with 8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a not found: ID does not exist" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"
Apr 23 16:39:35.600962 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.600882 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} err="failed to get container status \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": rpc error: code = NotFound desc = could not find container \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": container with ID starting with 8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a not found: ID does not exist"
Apr 23 16:39:35.600962 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.600906 2571 scope.go:117] "RemoveContainer" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"
Apr 23 16:39:35.601443 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:39:35.601425 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": container with ID starting with 2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb not found: ID does not exist" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"
Apr 23 16:39:35.601511 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.601447 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} err="failed to get container status \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": rpc error: code = NotFound desc = could not find container \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": container with ID starting with 2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb not found: ID does not exist"
Apr 23 16:39:35.601511 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.601463 2571 scope.go:117] "RemoveContainer" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"
Apr 23 16:39:35.601724 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:39:35.601708 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": container with ID starting with a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6 not found: ID does not exist" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"
Apr 23 16:39:35.601772 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.601728 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} err="failed to get container status \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": rpc error: code = NotFound desc = could not find container \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": container with ID starting with a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6 not found: ID does not exist"
Apr 23 16:39:35.601772 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.601741 2571 scope.go:117] "RemoveContainer" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"
Apr 23 16:39:35.601917 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:39:35.601899 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": container with ID starting with 6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2 not found: ID does not exist" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"
Apr 23 16:39:35.601954 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.601922 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"} err="failed to get container status \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": rpc error: code = NotFound desc = could not find container \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": container with ID starting with 6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2 not found: ID does not exist"
Apr 23 16:39:35.601954 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.601934 2571 scope.go:117] "RemoveContainer" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"
Apr 23 16:39:35.602110 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602094 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} err="failed to get container status \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": rpc error: code = NotFound desc = could not find container \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": container with ID starting with 470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893 not found: ID does not exist"
Apr 23 16:39:35.602146 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602110 2571 scope.go:117] "RemoveContainer" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"
Apr 23 16:39:35.602296 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602273 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} err="failed to get container status \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": rpc error: code = NotFound desc = could not find container \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": container with ID starting with 561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e not found: ID does not exist"
Apr 23 16:39:35.602339 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602299 2571 scope.go:117] "RemoveContainer" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"
Apr 23 16:39:35.602490 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602475 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} err="failed to get container status \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": rpc error: code = NotFound desc = could not find container \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": container with ID starting with 5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3 not found: ID does not exist"
Apr 23 16:39:35.602527 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602490 2571 scope.go:117] "RemoveContainer" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"
Apr 23 16:39:35.602710 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602689 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} err="failed to get container status \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": rpc error: code = NotFound desc = could not find container \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": container with ID starting with 8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a not found: ID does not exist"
Apr 23 16:39:35.602754 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602711 2571 scope.go:117] "RemoveContainer" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"
Apr 23 16:39:35.602922 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602906 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} err="failed to get container status \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": rpc error: code = NotFound desc = could not find container \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": container with ID starting with 2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb not found: ID does not exist"
Apr 23 16:39:35.602922 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.602921 2571 scope.go:117] "RemoveContainer" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"
Apr 23 16:39:35.603123 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603107 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} err="failed to get container status \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": rpc error: code = NotFound desc = could not find container \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": container with ID starting with a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6 not found: ID does not exist"
Apr 23 16:39:35.603123 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603123 2571 scope.go:117] "RemoveContainer" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"
Apr 23 16:39:35.603295 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603279 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"} err="failed to get container status \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": rpc error: code = NotFound desc = could not find container \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": container with ID starting with 6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2 not found: ID does not exist"
Apr 23 16:39:35.603344 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603295 2571 scope.go:117] "RemoveContainer" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"
Apr 23 16:39:35.603448 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603434 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} err="failed to get container status \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": rpc error: code = NotFound desc = could not find container \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": container with ID starting with 470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893 not found: ID does not exist"
Apr 23 16:39:35.603494 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603448 2571 scope.go:117] "RemoveContainer" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"
Apr 23 16:39:35.603605 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603580 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} err="failed to get container status \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": rpc error: code = NotFound desc = could not find container \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": container with ID starting with 561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e not found: ID does not exist"
Apr 23 16:39:35.603659 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603607 2571 scope.go:117] "RemoveContainer" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"
Apr 23 16:39:35.603781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603757 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} err="failed to get container status \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": rpc error: code = NotFound desc = could not find container \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": container with ID starting with 5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3 not found: ID does not exist"
Apr 23 16:39:35.603816 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603780 2571 scope.go:117] "RemoveContainer" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"
Apr 23 16:39:35.603952 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603936 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} err="failed to get container status \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": rpc error: code = NotFound desc = could not find container \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": container with ID starting with 8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a not found: ID does not exist"
Apr 23 16:39:35.603993 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.603952 2571 scope.go:117] "RemoveContainer" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"
Apr 23 16:39:35.604148 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604133 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} err="failed to get container status \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": rpc error: code = NotFound desc = could not find container \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": container with ID starting with 2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb not found: ID does not exist"
Apr 23 16:39:35.604185 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604147 2571 scope.go:117] "RemoveContainer" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"
Apr 23 16:39:35.604315 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604300 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} err="failed to get container status \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": rpc error: code = NotFound desc = could not find container \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": container with ID starting with a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6 not found: ID does not exist"
Apr 23 16:39:35.604351 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604314 2571 scope.go:117] "RemoveContainer" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"
Apr 23 16:39:35.604494 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604477 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"} err="failed to get container status \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": rpc error: code = NotFound desc = could not find container \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": container with ID starting with 6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2 not found: ID does not exist"
Apr 23 16:39:35.604536 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604494 2571 scope.go:117] "RemoveContainer" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"
Apr 23 16:39:35.604671 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604656 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} err="failed to get container status \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": rpc error: code = NotFound desc = could not find container \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": container with ID starting with 470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893 not found: ID does not exist"
Apr 23 16:39:35.604718 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604671 2571 scope.go:117] "RemoveContainer" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"
Apr 23 16:39:35.604855 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604828 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} err="failed to get container status \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": rpc error: code = NotFound desc = could not find container \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": container with ID starting with 561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e not found: ID does not exist"
Apr 23 16:39:35.604917 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.604857 2571 scope.go:117] "RemoveContainer" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"
Apr 23 16:39:35.605033 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605018 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} err="failed to get container status \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": rpc error: code = NotFound desc = could not find container \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": container with ID starting with 5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3 not found: ID does not exist"
Apr 23 16:39:35.605076 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605033 2571 scope.go:117] "RemoveContainer" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"
Apr 23 16:39:35.605194 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605180 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} err="failed to get container status \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": rpc error: code = NotFound desc = could not find container \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": container with ID starting with 8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a not found: ID does not exist"
Apr 23 16:39:35.605241 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605194 2571 scope.go:117] "RemoveContainer" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"
Apr 23 16:39:35.605343 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605328 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} err="failed to get container status \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": rpc error: code = NotFound desc = could not find container \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": container with ID starting with 2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb not found: ID does not exist"
Apr 23 16:39:35.605383 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605343 2571 scope.go:117] "RemoveContainer"
containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6" Apr 23 16:39:35.605555 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605540 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} err="failed to get container status \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": rpc error: code = NotFound desc = could not find container \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": container with ID starting with a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6 not found: ID does not exist" Apr 23 16:39:35.605555 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605554 2571 scope.go:117] "RemoveContainer" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2" Apr 23 16:39:35.605736 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605722 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"} err="failed to get container status \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": rpc error: code = NotFound desc = could not find container \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": container with ID starting with 6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2 not found: ID does not exist" Apr 23 16:39:35.605777 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605736 2571 scope.go:117] "RemoveContainer" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893" Apr 23 16:39:35.605930 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605914 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} err="failed to get container status 
\"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": rpc error: code = NotFound desc = could not find container \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": container with ID starting with 470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893 not found: ID does not exist" Apr 23 16:39:35.605978 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.605929 2571 scope.go:117] "RemoveContainer" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e" Apr 23 16:39:35.606117 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606102 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} err="failed to get container status \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": rpc error: code = NotFound desc = could not find container \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": container with ID starting with 561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e not found: ID does not exist" Apr 23 16:39:35.606117 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606117 2571 scope.go:117] "RemoveContainer" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3" Apr 23 16:39:35.606282 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606266 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} err="failed to get container status \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": rpc error: code = NotFound desc = could not find container \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": container with ID starting with 5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3 not found: ID does not exist" Apr 23 16:39:35.606282 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:39:35.606282 2571 scope.go:117] "RemoveContainer" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a" Apr 23 16:39:35.606481 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606459 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} err="failed to get container status \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": rpc error: code = NotFound desc = could not find container \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": container with ID starting with 8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a not found: ID does not exist" Apr 23 16:39:35.606522 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606482 2571 scope.go:117] "RemoveContainer" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb" Apr 23 16:39:35.606695 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606678 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} err="failed to get container status \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": rpc error: code = NotFound desc = could not find container \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": container with ID starting with 2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb not found: ID does not exist" Apr 23 16:39:35.606755 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606694 2571 scope.go:117] "RemoveContainer" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6" Apr 23 16:39:35.606854 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606838 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} err="failed to get container status \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": rpc error: code = NotFound desc = could not find container \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": container with ID starting with a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6 not found: ID does not exist" Apr 23 16:39:35.606854 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606853 2571 scope.go:117] "RemoveContainer" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2" Apr 23 16:39:35.606999 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.606978 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"} err="failed to get container status \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": rpc error: code = NotFound desc = could not find container \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": container with ID starting with 6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2 not found: ID does not exist" Apr 23 16:39:35.607039 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607000 2571 scope.go:117] "RemoveContainer" containerID="470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893" Apr 23 16:39:35.607183 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607169 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893"} err="failed to get container status \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": rpc error: code = NotFound desc = could not find container \"470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893\": container with ID starting with 
470ed98bd7b9b8416775e1a4ae73668e0915bbeb9f8415261ac454729b4a2893 not found: ID does not exist" Apr 23 16:39:35.607219 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607183 2571 scope.go:117] "RemoveContainer" containerID="561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e" Apr 23 16:39:35.607381 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607364 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e"} err="failed to get container status \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": rpc error: code = NotFound desc = could not find container \"561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e\": container with ID starting with 561e0dcabdb37245e988af00dbc73fae7bb570dc232d5788098a89a3b51d789e not found: ID does not exist" Apr 23 16:39:35.607381 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607380 2571 scope.go:117] "RemoveContainer" containerID="5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3" Apr 23 16:39:35.607531 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607516 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3"} err="failed to get container status \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": rpc error: code = NotFound desc = could not find container \"5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3\": container with ID starting with 5a98dc5535fac46a576c6f368f8bdbfa1225ef1602fc858f8d02233dd5afdca3 not found: ID does not exist" Apr 23 16:39:35.607575 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607531 2571 scope.go:117] "RemoveContainer" containerID="8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a" Apr 23 16:39:35.607691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607676 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a"} err="failed to get container status \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": rpc error: code = NotFound desc = could not find container \"8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a\": container with ID starting with 8745abf5aca8efb967d829623f8901fb7b3eb7adf77cf66f995444de1abdb26a not found: ID does not exist" Apr 23 16:39:35.607737 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607690 2571 scope.go:117] "RemoveContainer" containerID="2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb" Apr 23 16:39:35.607853 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607837 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb"} err="failed to get container status \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": rpc error: code = NotFound desc = could not find container \"2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb\": container with ID starting with 2044ed131be5738777f9139c396bbe628f7c53c67b511d16cc1551cfd8cad5cb not found: ID does not exist" Apr 23 16:39:35.607890 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607853 2571 scope.go:117] "RemoveContainer" containerID="a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6" Apr 23 16:39:35.608002 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.607988 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6"} err="failed to get container status \"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": rpc error: code = NotFound desc = could not find container 
\"a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6\": container with ID starting with a18d47407edc20caedaf397d189d05ddb440777dc0117e3d0b211bc697e478e6 not found: ID does not exist" Apr 23 16:39:35.608044 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.608001 2571 scope.go:117] "RemoveContainer" containerID="6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2" Apr 23 16:39:35.608169 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.608156 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2"} err="failed to get container status \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": rpc error: code = NotFound desc = could not find container \"6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2\": container with ID starting with 6605d1ca9d563e2d9abd6c65e925af194e3c65af5a76e2a1dedf4945815c64f2 not found: ID does not exist" Apr 23 16:39:35.673251 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673217 2571 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-kube-rbac-proxy\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673251 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673245 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-tls\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673251 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673255 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" 
Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673264 2571 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-metrics-client-certs\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673274 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673283 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-shx9f\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-kube-api-access-shx9f\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673291 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673302 2571 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673311 2571 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config\") on node 
\"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673319 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-web-config\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673329 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-k8s-db\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673338 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673346 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673354 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-config-out\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673363 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-tls-assets\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.673452 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.673371 2571 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b91bab7f-5f15-4bf1-a6c3-241a09794ddc-secret-grpc-tls\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:39:35.876420 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.876386 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:35.883818 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.883481 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:35.909273 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909249 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:35.909581 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909564 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy" Apr 23 16:39:35.909658 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909585 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy" Apr 23 16:39:35.909658 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909623 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-web" Apr 23 16:39:35.909658 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909632 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-web" Apr 23 16:39:35.909658 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909654 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="thanos-sidecar" Apr 23 16:39:35.909781 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909663 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="thanos-sidecar" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909676 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="init-config-reloader" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909684 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="init-config-reloader" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909696 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="prometheus" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909704 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="prometheus" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909713 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="config-reloader" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909721 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="config-reloader" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909734 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-thanos" Apr 23 16:39:35.909781 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909743 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-thanos" 
Apr 23 16:39:35.910022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909799 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="prometheus" Apr 23 16:39:35.910022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909812 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy" Apr 23 16:39:35.910022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909822 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="config-reloader" Apr 23 16:39:35.910022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909832 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-thanos" Apr 23 16:39:35.910022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909843 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="thanos-sidecar" Apr 23 16:39:35.910022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.909853 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" containerName="kube-rbac-proxy-web" Apr 23 16:39:35.914684 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.914667 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:35.917691 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.917672 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 16:39:35.917787 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.917672 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 16:39:35.918734 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.918720 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lnpsq\"" Apr 23 16:39:35.918853 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.918838 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 16:39:35.919417 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919272 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bkalbold486o3\"" Apr 23 16:39:35.919417 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919279 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 16:39:35.919417 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919284 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 16:39:35.919417 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919322 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 16:39:35.919417 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919366 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:39:35.919720 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919484 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 16:39:35.919720 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919571 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 16:39:35.919720 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.919625 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 16:39:35.920228 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.920210 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 16:39:35.922194 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.922177 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 16:39:35.926356 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.926334 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 16:39:35.927809 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:35.927790 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:36.076169 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076137 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076169 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076171 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2d0fe11-f062-4bbf-967f-b49737665d70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlvmp\" (UniqueName: \"kubernetes.io/projected/a2d0fe11-f062-4bbf-967f-b49737665d70-kube-api-access-tlvmp\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-config\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076361 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076336 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2d0fe11-f062-4bbf-967f-b49737665d70-config-out\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: 
I0423 16:39:36.076387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-web-config\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076582 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.076844 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.076611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.177645 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-web-config\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.177645 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177628 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177704 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177723 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2d0fe11-f062-4bbf-967f-b49737665d70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177771 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlvmp\" (UniqueName: \"kubernetes.io/projected/a2d0fe11-f062-4bbf-967f-b49737665d70-kube-api-access-tlvmp\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-config\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 
kubenswrapper[2571]: I0423 16:39:36.177925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2d0fe11-f062-4bbf-967f-b49737665d70-config-out\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.177984 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.178010 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.178041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.178066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178095 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.178091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.178916 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.178248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.179274 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.179249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.181796 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.181766 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.181969 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.181947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.182071 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.182051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.182753 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.182732 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.182873 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.182754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2d0fe11-f062-4bbf-967f-b49737665d70-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.182983 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.182895 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2d0fe11-f062-4bbf-967f-b49737665d70-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183091 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.182904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2d0fe11-f062-4bbf-967f-b49737665d70-config-out\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183091 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.182985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-config\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183265 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.183038 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183265 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.183047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-web-config\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183265 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.183048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183265 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.183259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183491 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.183425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.183882 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.183864 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.184633 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.184613 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a2d0fe11-f062-4bbf-967f-b49737665d70-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.188950 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.188927 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tlvmp\" (UniqueName: \"kubernetes.io/projected/a2d0fe11-f062-4bbf-967f-b49737665d70-kube-api-access-tlvmp\") pod \"prometheus-k8s-0\" (UID: \"a2d0fe11-f062-4bbf-967f-b49737665d70\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.223892 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.223868 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:36.346559 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.346529 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:39:36.349697 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:39:36.349670 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d0fe11_f062_4bbf_967f_b49737665d70.slice/crio-602582cdfba5cf3649016612adac75f25e1c1d2c5867d5e54109d4b8c3a454ba WatchSource:0}: Error finding container 602582cdfba5cf3649016612adac75f25e1c1d2c5867d5e54109d4b8c3a454ba: Status 404 returned error can't find the container with id 602582cdfba5cf3649016612adac75f25e1c1d2c5867d5e54109d4b8c3a454ba Apr 23 16:39:36.557383 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.557349 2571 generic.go:358] "Generic (PLEG): container finished" podID="a2d0fe11-f062-4bbf-967f-b49737665d70" containerID="270a3a79f651a831b05e2b44672cef8abd405950a3313b0fb14c305be9bbd8cb" exitCode=0 Apr 23 16:39:36.557533 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.557399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerDied","Data":"270a3a79f651a831b05e2b44672cef8abd405950a3313b0fb14c305be9bbd8cb"} Apr 23 16:39:36.557533 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.557429 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerStarted","Data":"602582cdfba5cf3649016612adac75f25e1c1d2c5867d5e54109d4b8c3a454ba"} Apr 23 16:39:36.749282 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:36.749131 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91bab7f-5f15-4bf1-a6c3-241a09794ddc" path="/var/lib/kubelet/pods/b91bab7f-5f15-4bf1-a6c3-241a09794ddc/volumes" Apr 23 16:39:37.563268 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:37.563234 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerStarted","Data":"faed29e471c4eac4b79b8a474afa115287781398a63476dd04bdbccab0b9eac5"} Apr 23 16:39:37.563268 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:37.563269 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerStarted","Data":"ad93f6741095591af0bca8f9eb62aa841c1afaa2e728a6d661e48e8aa3a8398c"} Apr 23 16:39:37.563686 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:37.563281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerStarted","Data":"88f6018093d1971d26c3bede3bffe9616f3c3073b6a136cb1f3919804ef0d634"} Apr 23 16:39:37.563686 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:37.563290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerStarted","Data":"d51388d15055dbe1b6fa909230aa3be817ca1a1560839b72250ad6f1a0f2535d"} Apr 23 16:39:37.563686 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:37.563298 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerStarted","Data":"5e0a0fbf74ae760b5c23f6805d0f34d64329d48a092606b43ce34d5aab19c225"} Apr 23 16:39:37.563686 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:37.563305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a2d0fe11-f062-4bbf-967f-b49737665d70","Type":"ContainerStarted","Data":"f36588ae764f0d276a86d0a20a249cf1a57edf020471f11136c1224c8e9fa6bd"} Apr 23 16:39:37.600157 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:37.600108 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.600093415 podStartE2EDuration="2.600093415s" podCreationTimestamp="2026-04-23 16:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:39:37.595669727 +0000 UTC m=+263.474079332" watchObservedRunningTime="2026-04-23 16:39:37.600093415 +0000 UTC m=+263.478503019" Apr 23 16:39:41.224321 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:39:41.224289 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:40:14.619888 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:40:14.619854 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:40:14.620907 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:40:14.620883 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:40:36.224474 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:40:36.224432 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:40:36.239575 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:40:36.239551 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:40:36.736810 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:40:36.736780 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:44:10.098836 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.098767 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf"] Apr 23 16:44:10.101791 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.101776 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.105677 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.105657 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 23 16:44:10.105839 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.105826 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-zl9f4\"" Apr 23 16:44:10.105966 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.105954 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 23 16:44:10.106116 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.106106 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:44:10.106235 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.106215 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 23 16:44:10.106365 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.106348 2571 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 23 16:44:10.109910 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.109889 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf"] Apr 23 16:44:10.202645 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.202614 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a833e29b-84ec-4c94-89dd-fd58326929e7-cert\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.202645 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.202645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g9ds\" (UniqueName: \"kubernetes.io/projected/a833e29b-84ec-4c94-89dd-fd58326929e7-kube-api-access-8g9ds\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.202829 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.202673 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a833e29b-84ec-4c94-89dd-fd58326929e7-metrics-cert\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.202829 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.202799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a833e29b-84ec-4c94-89dd-fd58326929e7-manager-config\") pod 
\"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.303422 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.303390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a833e29b-84ec-4c94-89dd-fd58326929e7-metrics-cert\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.303625 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.303454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a833e29b-84ec-4c94-89dd-fd58326929e7-manager-config\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.303625 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.303475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a833e29b-84ec-4c94-89dd-fd58326929e7-cert\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.303625 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.303492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g9ds\" (UniqueName: \"kubernetes.io/projected/a833e29b-84ec-4c94-89dd-fd58326929e7-kube-api-access-8g9ds\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.304193 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.304162 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a833e29b-84ec-4c94-89dd-fd58326929e7-manager-config\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.305937 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.305912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a833e29b-84ec-4c94-89dd-fd58326929e7-metrics-cert\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.306024 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.306001 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a833e29b-84ec-4c94-89dd-fd58326929e7-cert\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.312942 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.312913 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g9ds\" (UniqueName: \"kubernetes.io/projected/a833e29b-84ec-4c94-89dd-fd58326929e7-kube-api-access-8g9ds\") pod \"lws-controller-manager-b454c4fb-h92mf\" (UID: \"a833e29b-84ec-4c94-89dd-fd58326929e7\") " pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.411404 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.411325 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:10.528031 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.528002 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf"] Apr 23 16:44:10.531733 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:44:10.531699 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda833e29b_84ec_4c94_89dd_fd58326929e7.slice/crio-916e6726288f4b55fa9452d887f4c3abebb6b9e7c9fcdf3b24367fe7d9f3cfcc WatchSource:0}: Error finding container 916e6726288f4b55fa9452d887f4c3abebb6b9e7c9fcdf3b24367fe7d9f3cfcc: Status 404 returned error can't find the container with id 916e6726288f4b55fa9452d887f4c3abebb6b9e7c9fcdf3b24367fe7d9f3cfcc Apr 23 16:44:10.535794 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:10.535779 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:44:11.269059 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:11.269024 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" event={"ID":"a833e29b-84ec-4c94-89dd-fd58326929e7","Type":"ContainerStarted","Data":"916e6726288f4b55fa9452d887f4c3abebb6b9e7c9fcdf3b24367fe7d9f3cfcc"} Apr 23 16:44:35.336705 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:35.336662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" event={"ID":"a833e29b-84ec-4c94-89dd-fd58326929e7","Type":"ContainerStarted","Data":"0bb6729e7fa672b889d8a050137942fa03d02ffcb6392635b80565b3b67ad6e0"} Apr 23 16:44:35.337089 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:35.336812 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:44:35.361686 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:35.361646 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" podStartSLOduration=1.590296231 podStartE2EDuration="25.361633619s" podCreationTimestamp="2026-04-23 16:44:10 +0000 UTC" firstStartedPulling="2026-04-23 16:44:10.535906589 +0000 UTC m=+536.414316172" lastFinishedPulling="2026-04-23 16:44:34.307243978 +0000 UTC m=+560.185653560" observedRunningTime="2026-04-23 16:44:35.360521382 +0000 UTC m=+561.238930986" watchObservedRunningTime="2026-04-23 16:44:35.361633619 +0000 UTC m=+561.240043223" Apr 23 16:44:46.340968 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:44:46.340940 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-b454c4fb-h92mf" Apr 23 16:45:14.637775 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:14.637741 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:45:14.639657 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:14.639638 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:45:20.693096 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.693061 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6"] Apr 23 16:45:20.697852 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.697835 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" Apr 23 16:45:20.700827 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.700800 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-2b57v\"" Apr 23 16:45:20.700827 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.700824 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 23 16:45:20.700990 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.700921 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 16:45:20.701048 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.701041 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 16:45:20.707517 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.707497 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6"] Apr 23 16:45:20.724779 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.724756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nqvp\" (UniqueName: \"kubernetes.io/projected/84421eaf-9c05-4479-b702-3cb23a0cc73f-kube-api-access-7nqvp\") pod \"dns-operator-controller-manager-844548ff4c-hmlb6\" (UID: \"84421eaf-9c05-4479-b702-3cb23a0cc73f\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" Apr 23 16:45:20.826117 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.826075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nqvp\" (UniqueName: \"kubernetes.io/projected/84421eaf-9c05-4479-b702-3cb23a0cc73f-kube-api-access-7nqvp\") pod 
\"dns-operator-controller-manager-844548ff4c-hmlb6\" (UID: \"84421eaf-9c05-4479-b702-3cb23a0cc73f\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" Apr 23 16:45:20.838445 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:20.838423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nqvp\" (UniqueName: \"kubernetes.io/projected/84421eaf-9c05-4479-b702-3cb23a0cc73f-kube-api-access-7nqvp\") pod \"dns-operator-controller-manager-844548ff4c-hmlb6\" (UID: \"84421eaf-9c05-4479-b702-3cb23a0cc73f\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" Apr 23 16:45:21.008421 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:21.008346 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" Apr 23 16:45:21.130801 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:21.130769 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6"] Apr 23 16:45:21.134311 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:45:21.134283 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84421eaf_9c05_4479_b702_3cb23a0cc73f.slice/crio-1cf1c7c681ca5f1d127db3aecd52cc8dbde2d47b9fb6d9e223e4cca4a669bfca WatchSource:0}: Error finding container 1cf1c7c681ca5f1d127db3aecd52cc8dbde2d47b9fb6d9e223e4cca4a669bfca: Status 404 returned error can't find the container with id 1cf1c7c681ca5f1d127db3aecd52cc8dbde2d47b9fb6d9e223e4cca4a669bfca Apr 23 16:45:21.459156 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:21.459123 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" event={"ID":"84421eaf-9c05-4479-b702-3cb23a0cc73f","Type":"ContainerStarted","Data":"1cf1c7c681ca5f1d127db3aecd52cc8dbde2d47b9fb6d9e223e4cca4a669bfca"} Apr 23 
16:45:23.830402 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.830370 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq"] Apr 23 16:45:23.833472 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.833456 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.836126 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.836105 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 23 16:45:23.836229 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.836152 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xz7jf\"" Apr 23 16:45:23.836549 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.836535 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 23 16:45:23.844437 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.844416 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq"] Apr 23 16:45:23.847346 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.847325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/35a4046a-203e-4b42-87de-5fbb44ea7f4d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.847425 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.847356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5djw\" (UniqueName: 
\"kubernetes.io/projected/35a4046a-203e-4b42-87de-5fbb44ea7f4d-kube-api-access-j5djw\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.847425 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.847376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35a4046a-203e-4b42-87de-5fbb44ea7f4d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.948491 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.948459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/35a4046a-203e-4b42-87de-5fbb44ea7f4d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.948491 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.948493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5djw\" (UniqueName: \"kubernetes.io/projected/35a4046a-203e-4b42-87de-5fbb44ea7f4d-kube-api-access-j5djw\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.948754 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.948512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35a4046a-203e-4b42-87de-5fbb44ea7f4d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.948754 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:45:23.948627 2571 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 23 16:45:23.948754 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:45:23.948701 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35a4046a-203e-4b42-87de-5fbb44ea7f4d-plugin-serving-cert podName:35a4046a-203e-4b42-87de-5fbb44ea7f4d nodeName:}" failed. No retries permitted until 2026-04-23 16:45:24.448680749 +0000 UTC m=+610.327090337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/35a4046a-203e-4b42-87de-5fbb44ea7f4d-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-fhhcq" (UID: "35a4046a-203e-4b42-87de-5fbb44ea7f4d") : secret "plugin-serving-cert" not found Apr 23 16:45:23.949175 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.949158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/35a4046a-203e-4b42-87de-5fbb44ea7f4d-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:23.961957 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:23.961928 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5djw\" (UniqueName: \"kubernetes.io/projected/35a4046a-203e-4b42-87de-5fbb44ea7f4d-kube-api-access-j5djw\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:24.451106 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:24.451071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/35a4046a-203e-4b42-87de-5fbb44ea7f4d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:24.453349 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:24.453327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/35a4046a-203e-4b42-87de-5fbb44ea7f4d-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-fhhcq\" (UID: \"35a4046a-203e-4b42-87de-5fbb44ea7f4d\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:24.742047 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:24.741955 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" Apr 23 16:45:24.875883 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:24.875859 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq"] Apr 23 16:45:24.878424 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:45:24.878401 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a4046a_203e_4b42_87de_5fbb44ea7f4d.slice/crio-c4bf3a124978e1f2a4675f02de7d977fce6be6e9092f5d6e89ad2e900907d324 WatchSource:0}: Error finding container c4bf3a124978e1f2a4675f02de7d977fce6be6e9092f5d6e89ad2e900907d324: Status 404 returned error can't find the container with id c4bf3a124978e1f2a4675f02de7d977fce6be6e9092f5d6e89ad2e900907d324 Apr 23 16:45:25.471808 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:25.471768 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" 
event={"ID":"35a4046a-203e-4b42-87de-5fbb44ea7f4d","Type":"ContainerStarted","Data":"c4bf3a124978e1f2a4675f02de7d977fce6be6e9092f5d6e89ad2e900907d324"} Apr 23 16:45:26.476072 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:26.476033 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" event={"ID":"84421eaf-9c05-4479-b702-3cb23a0cc73f","Type":"ContainerStarted","Data":"4a7d53fd2bd1c1f8a85729e2f9f707bc2b05865c3097ec7a17ae47df5e3d370e"} Apr 23 16:45:26.476488 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:26.476189 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" Apr 23 16:45:26.495131 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:26.495070 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" podStartSLOduration=1.483907577 podStartE2EDuration="6.495053426s" podCreationTimestamp="2026-04-23 16:45:20 +0000 UTC" firstStartedPulling="2026-04-23 16:45:21.136709122 +0000 UTC m=+607.015118704" lastFinishedPulling="2026-04-23 16:45:26.147854956 +0000 UTC m=+612.026264553" observedRunningTime="2026-04-23 16:45:26.494137829 +0000 UTC m=+612.372547435" watchObservedRunningTime="2026-04-23 16:45:26.495053426 +0000 UTC m=+612.373463033" Apr 23 16:45:33.496052 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:33.496011 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" event={"ID":"35a4046a-203e-4b42-87de-5fbb44ea7f4d","Type":"ContainerStarted","Data":"5ca9bb92a9520ef6030a9ec82226f03ef4bf21aa36f9606e58c62c21f12fecf2"} Apr 23 16:45:33.521276 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:33.521232 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-fhhcq" 
podStartSLOduration=2.679345808 podStartE2EDuration="10.521215816s" podCreationTimestamp="2026-04-23 16:45:23 +0000 UTC" firstStartedPulling="2026-04-23 16:45:24.879810443 +0000 UTC m=+610.758220026" lastFinishedPulling="2026-04-23 16:45:32.721680436 +0000 UTC m=+618.600090034" observedRunningTime="2026-04-23 16:45:33.519306798 +0000 UTC m=+619.397716425" watchObservedRunningTime="2026-04-23 16:45:33.521215816 +0000 UTC m=+619.399625435" Apr 23 16:45:37.480410 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:45:37.480381 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-hmlb6" Apr 23 16:46:08.849502 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:08.849472 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hs4cp"] Apr 23 16:46:08.872395 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:08.872361 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hs4cp"] Apr 23 16:46:08.872541 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:08.872408 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:08.875198 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:08.875174 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 23 16:46:08.913045 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:08.913017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e0ae9dea-ca4b-461a-bcf6-31159b72f971-config-file\") pod \"limitador-limitador-64c8f475fb-hs4cp\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:08.913179 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:08.913098 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfbv4\" (UniqueName: \"kubernetes.io/projected/e0ae9dea-ca4b-461a-bcf6-31159b72f971-kube-api-access-gfbv4\") pod \"limitador-limitador-64c8f475fb-hs4cp\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:08.945112 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:08.945081 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hs4cp"] Apr 23 16:46:09.014385 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:09.014343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbv4\" (UniqueName: \"kubernetes.io/projected/e0ae9dea-ca4b-461a-bcf6-31159b72f971-kube-api-access-gfbv4\") pod \"limitador-limitador-64c8f475fb-hs4cp\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:09.014559 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:09.014454 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e0ae9dea-ca4b-461a-bcf6-31159b72f971-config-file\") pod \"limitador-limitador-64c8f475fb-hs4cp\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:09.015057 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:09.015033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e0ae9dea-ca4b-461a-bcf6-31159b72f971-config-file\") pod \"limitador-limitador-64c8f475fb-hs4cp\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:09.022578 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:09.022556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbv4\" (UniqueName: \"kubernetes.io/projected/e0ae9dea-ca4b-461a-bcf6-31159b72f971-kube-api-access-gfbv4\") pod \"limitador-limitador-64c8f475fb-hs4cp\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:09.182066 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:09.181976 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:09.302068 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:09.302040 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hs4cp"] Apr 23 16:46:09.304550 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:46:09.304518 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ae9dea_ca4b_461a_bcf6_31159b72f971.slice/crio-fec57f6c6b0bee5065252f880bc6c1b9bedc5707dd3758c7c44c03d4443e550c WatchSource:0}: Error finding container fec57f6c6b0bee5065252f880bc6c1b9bedc5707dd3758c7c44c03d4443e550c: Status 404 returned error can't find the container with id fec57f6c6b0bee5065252f880bc6c1b9bedc5707dd3758c7c44c03d4443e550c Apr 23 16:46:09.593191 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:09.593158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" event={"ID":"e0ae9dea-ca4b-461a-bcf6-31159b72f971","Type":"ContainerStarted","Data":"fec57f6c6b0bee5065252f880bc6c1b9bedc5707dd3758c7c44c03d4443e550c"} Apr 23 16:46:11.601197 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:11.601158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" event={"ID":"e0ae9dea-ca4b-461a-bcf6-31159b72f971","Type":"ContainerStarted","Data":"a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b"} Apr 23 16:46:11.601561 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:11.601268 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:11.618695 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:11.618645 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" podStartSLOduration=2.110387577 
podStartE2EDuration="3.618631967s" podCreationTimestamp="2026-04-23 16:46:08 +0000 UTC" firstStartedPulling="2026-04-23 16:46:09.306433702 +0000 UTC m=+655.184843286" lastFinishedPulling="2026-04-23 16:46:10.814678084 +0000 UTC m=+656.693087676" observedRunningTime="2026-04-23 16:46:11.616569088 +0000 UTC m=+657.494978697" watchObservedRunningTime="2026-04-23 16:46:11.618631967 +0000 UTC m=+657.497041572" Apr 23 16:46:22.606271 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:22.606243 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:24.836505 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:24.836472 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hs4cp"] Apr 23 16:46:24.836895 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:24.836689 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" podUID="e0ae9dea-ca4b-461a-bcf6-31159b72f971" containerName="limitador" containerID="cri-o://a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b" gracePeriod=30 Apr 23 16:46:25.380804 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.380782 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:25.431553 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.431486 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfbv4\" (UniqueName: \"kubernetes.io/projected/e0ae9dea-ca4b-461a-bcf6-31159b72f971-kube-api-access-gfbv4\") pod \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " Apr 23 16:46:25.431553 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.431521 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e0ae9dea-ca4b-461a-bcf6-31159b72f971-config-file\") pod \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\" (UID: \"e0ae9dea-ca4b-461a-bcf6-31159b72f971\") " Apr 23 16:46:25.431915 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.431890 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ae9dea-ca4b-461a-bcf6-31159b72f971-config-file" (OuterVolumeSpecName: "config-file") pod "e0ae9dea-ca4b-461a-bcf6-31159b72f971" (UID: "e0ae9dea-ca4b-461a-bcf6-31159b72f971"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:46:25.433616 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.433583 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ae9dea-ca4b-461a-bcf6-31159b72f971-kube-api-access-gfbv4" (OuterVolumeSpecName: "kube-api-access-gfbv4") pod "e0ae9dea-ca4b-461a-bcf6-31159b72f971" (UID: "e0ae9dea-ca4b-461a-bcf6-31159b72f971"). InnerVolumeSpecName "kube-api-access-gfbv4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:46:25.532764 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.532732 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gfbv4\" (UniqueName: \"kubernetes.io/projected/e0ae9dea-ca4b-461a-bcf6-31159b72f971-kube-api-access-gfbv4\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:46:25.532764 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.532761 2571 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e0ae9dea-ca4b-461a-bcf6-31159b72f971-config-file\") on node \"ip-10-0-136-190.ec2.internal\" DevicePath \"\"" Apr 23 16:46:25.642430 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.642394 2571 generic.go:358] "Generic (PLEG): container finished" podID="e0ae9dea-ca4b-461a-bcf6-31159b72f971" containerID="a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b" exitCode=0 Apr 23 16:46:25.642636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.642462 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" Apr 23 16:46:25.642636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.642473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" event={"ID":"e0ae9dea-ca4b-461a-bcf6-31159b72f971","Type":"ContainerDied","Data":"a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b"} Apr 23 16:46:25.642636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.642501 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hs4cp" event={"ID":"e0ae9dea-ca4b-461a-bcf6-31159b72f971","Type":"ContainerDied","Data":"fec57f6c6b0bee5065252f880bc6c1b9bedc5707dd3758c7c44c03d4443e550c"} Apr 23 16:46:25.642636 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.642516 2571 scope.go:117] "RemoveContainer" containerID="a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b" Apr 23 16:46:25.650919 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.650901 2571 scope.go:117] "RemoveContainer" containerID="a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b" Apr 23 16:46:25.651160 ip-10-0-136-190 kubenswrapper[2571]: E0423 16:46:25.651140 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b\": container with ID starting with a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b not found: ID does not exist" containerID="a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b" Apr 23 16:46:25.651212 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.651174 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b"} err="failed to get container status 
\"a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b\": rpc error: code = NotFound desc = could not find container \"a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b\": container with ID starting with a3d47a6f227839c3a73f454c892570afc5ea82cc2746ceb19d46baa7b8154f5b not found: ID does not exist" Apr 23 16:46:25.663288 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.663266 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hs4cp"] Apr 23 16:46:25.664737 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:25.664718 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hs4cp"] Apr 23 16:46:26.746446 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:26.746413 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ae9dea-ca4b-461a-bcf6-31159b72f971" path="/var/lib/kubelet/pods/e0ae9dea-ca4b-461a-bcf6-31159b72f971/volumes" Apr 23 16:46:43.954963 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.954931 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb"] Apr 23 16:46:43.955378 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.955212 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0ae9dea-ca4b-461a-bcf6-31159b72f971" containerName="limitador" Apr 23 16:46:43.955378 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.955222 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ae9dea-ca4b-461a-bcf6-31159b72f971" containerName="limitador" Apr 23 16:46:43.955378 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.955275 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0ae9dea-ca4b-461a-bcf6-31159b72f971" containerName="limitador" Apr 23 16:46:43.958267 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.958251 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:43.961119 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.961096 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 23 16:46:43.961233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.961158 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 16:46:43.961233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.961176 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 23 16:46:43.961233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.961186 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 16:46:43.961233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.961203 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 23 16:46:43.961233 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.961212 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-xc4l6\"" Apr 23 16:46:43.961507 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.961494 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 23 16:46:43.967399 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:43.967377 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb"] Apr 23 16:46:44.079367 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.079335 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.079548 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.079387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.079548 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.079412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.079548 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.079440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9tr2\" (UniqueName: \"kubernetes.io/projected/7b826983-703d-4b01-9dc9-3c8dadbc4155-kube-api-access-b9tr2\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.079548 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.079469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7b826983-703d-4b01-9dc9-3c8dadbc4155-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: 
\"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.079548 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.079485 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.079548 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.079533 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.180185 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.180150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.180309 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.180199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.180309 ip-10-0-136-190 kubenswrapper[2571]: I0423 
16:46:44.180232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.180309 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.180256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.180309 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.180273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9tr2\" (UniqueName: \"kubernetes.io/projected/7b826983-703d-4b01-9dc9-3c8dadbc4155-kube-api-access-b9tr2\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.180309 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.180303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7b826983-703d-4b01-9dc9-3c8dadbc4155-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.180551 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.180319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-token\") pod 
\"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.181081 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.181049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.182748 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.182728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7b826983-703d-4b01-9dc9-3c8dadbc4155-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.183219 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.182816 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.183219 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.183098 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.183219 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.183167 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.188657 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.188636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7b826983-703d-4b01-9dc9-3c8dadbc4155-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.188890 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.188874 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9tr2\" (UniqueName: \"kubernetes.io/projected/7b826983-703d-4b01-9dc9-3c8dadbc4155-kube-api-access-b9tr2\") pod \"istiod-openshift-gateway-55ff986f96-kzcbb\" (UID: \"7b826983-703d-4b01-9dc9-3c8dadbc4155\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.267880 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.267815 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:44.405883 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.405857 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb"] Apr 23 16:46:44.408395 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:46:44.408356 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b826983_703d_4b01_9dc9_3c8dadbc4155.slice/crio-40980de04f73848131e5489da054b589c6b55f2270e994bfb1b84222897a35ce WatchSource:0}: Error finding container 40980de04f73848131e5489da054b589c6b55f2270e994bfb1b84222897a35ce: Status 404 returned error can't find the container with id 40980de04f73848131e5489da054b589c6b55f2270e994bfb1b84222897a35ce Apr 23 16:46:44.695290 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:44.695252 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" event={"ID":"7b826983-703d-4b01-9dc9-3c8dadbc4155","Type":"ContainerStarted","Data":"40980de04f73848131e5489da054b589c6b55f2270e994bfb1b84222897a35ce"} Apr 23 16:46:46.891126 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:46.891087 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 23 16:46:46.891336 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:46.891158 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 23 16:46:47.705397 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:47.705362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" 
event={"ID":"7b826983-703d-4b01-9dc9-3c8dadbc4155","Type":"ContainerStarted","Data":"18e0da38fc563503fb1c28cb9d5e48044c3481b36e46e69ab1857ca678e41ad2"} Apr 23 16:46:47.705576 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:47.705554 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:47.707204 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:47.707188 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" Apr 23 16:46:47.726681 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:46:47.726637 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-kzcbb" podStartSLOduration=2.24603581 podStartE2EDuration="4.726624761s" podCreationTimestamp="2026-04-23 16:46:43 +0000 UTC" firstStartedPulling="2026-04-23 16:46:44.410281758 +0000 UTC m=+690.288691341" lastFinishedPulling="2026-04-23 16:46:46.890870706 +0000 UTC m=+692.769280292" observedRunningTime="2026-04-23 16:46:47.725044816 +0000 UTC m=+693.603454422" watchObservedRunningTime="2026-04-23 16:46:47.726624761 +0000 UTC m=+693.605034360" Apr 23 16:50:14.656320 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:50:14.656242 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:50:14.659671 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:50:14.659648 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:55:14.674856 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:55:14.674825 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:55:14.679047 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:55:14.679026 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 16:56:11.915971 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.915894 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj"] Apr 23 16:56:11.919022 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.919003 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:11.922969 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.922949 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 16:56:11.923074 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.922982 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 16:56:11.923074 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.922998 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-mv4mn\"" Apr 23 16:56:11.923074 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.923012 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 16:56:11.927200 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.927176 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj"] Apr 23 16:56:11.949834 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.949812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/b2c16d7b-b156-41a2-bd43-98123e109abe-cert\") pod \"llmisvc-controller-manager-5c8bbb8d67-gcckj\" (UID: \"b2c16d7b-b156-41a2-bd43-98123e109abe\") " pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:11.949927 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:11.949869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtq7\" (UniqueName: \"kubernetes.io/projected/b2c16d7b-b156-41a2-bd43-98123e109abe-kube-api-access-rbtq7\") pod \"llmisvc-controller-manager-5c8bbb8d67-gcckj\" (UID: \"b2c16d7b-b156-41a2-bd43-98123e109abe\") " pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:12.050421 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:12.050395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtq7\" (UniqueName: \"kubernetes.io/projected/b2c16d7b-b156-41a2-bd43-98123e109abe-kube-api-access-rbtq7\") pod \"llmisvc-controller-manager-5c8bbb8d67-gcckj\" (UID: \"b2c16d7b-b156-41a2-bd43-98123e109abe\") " pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:12.050541 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:12.050446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2c16d7b-b156-41a2-bd43-98123e109abe-cert\") pod \"llmisvc-controller-manager-5c8bbb8d67-gcckj\" (UID: \"b2c16d7b-b156-41a2-bd43-98123e109abe\") " pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:12.052675 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:12.052658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2c16d7b-b156-41a2-bd43-98123e109abe-cert\") pod \"llmisvc-controller-manager-5c8bbb8d67-gcckj\" (UID: \"b2c16d7b-b156-41a2-bd43-98123e109abe\") " pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:12.058973 
ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:12.058940 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtq7\" (UniqueName: \"kubernetes.io/projected/b2c16d7b-b156-41a2-bd43-98123e109abe-kube-api-access-rbtq7\") pod \"llmisvc-controller-manager-5c8bbb8d67-gcckj\" (UID: \"b2c16d7b-b156-41a2-bd43-98123e109abe\") " pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:12.230614 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:12.230572 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:12.346513 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:12.346483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj"] Apr 23 16:56:12.349199 ip-10-0-136-190 kubenswrapper[2571]: W0423 16:56:12.349171 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb2c16d7b_b156_41a2_bd43_98123e109abe.slice/crio-7a2a9551b9c347f9ab509f0793b9746548043748d2a52b5e7cd6aecc1bc67415 WatchSource:0}: Error finding container 7a2a9551b9c347f9ab509f0793b9746548043748d2a52b5e7cd6aecc1bc67415: Status 404 returned error can't find the container with id 7a2a9551b9c347f9ab509f0793b9746548043748d2a52b5e7cd6aecc1bc67415 Apr 23 16:56:12.350405 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:12.350389 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:56:13.261052 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:13.261011 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" event={"ID":"b2c16d7b-b156-41a2-bd43-98123e109abe","Type":"ContainerStarted","Data":"7a2a9551b9c347f9ab509f0793b9746548043748d2a52b5e7cd6aecc1bc67415"} Apr 23 16:56:16.271119 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:16.271078 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" event={"ID":"b2c16d7b-b156-41a2-bd43-98123e109abe","Type":"ContainerStarted","Data":"b7d40373d520008016c72fb68eda9b97f50fe66b29f10956e91ea25c23f77344"} Apr 23 16:56:16.271481 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:16.271226 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 16:56:16.291740 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:16.291695 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" podStartSLOduration=2.03556352 podStartE2EDuration="5.291681778s" podCreationTimestamp="2026-04-23 16:56:11 +0000 UTC" firstStartedPulling="2026-04-23 16:56:12.350507018 +0000 UTC m=+1258.228916602" lastFinishedPulling="2026-04-23 16:56:15.606625263 +0000 UTC m=+1261.485034860" observedRunningTime="2026-04-23 16:56:16.29069718 +0000 UTC m=+1262.169106785" watchObservedRunningTime="2026-04-23 16:56:16.291681778 +0000 UTC m=+1262.170091382" Apr 23 16:56:47.276345 ip-10-0-136-190 kubenswrapper[2571]: I0423 16:56:47.276312 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5c8bbb8d67-gcckj" Apr 23 17:00:14.695208 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:00:14.695179 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 17:00:14.700133 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:00:14.700112 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 17:05:14.716822 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:14.716792 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 17:05:14.720088 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:14.720062 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log" Apr 23 17:05:25.680778 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:25.680750 2571 ???:1] "http2: server: error reading preface from client 10.0.136.27:34668: read tcp 10.0.136.190:10250->10.0.136.27:34668: read: connection reset by peer" Apr 23 17:05:25.751965 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:25.751927 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-kzcbb_7b826983-703d-4b01-9dc9-3c8dadbc4155/discovery/0.log" Apr 23 17:05:26.640582 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:26.640517 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-kzcbb_7b826983-703d-4b01-9dc9-3c8dadbc4155/discovery/0.log" Apr 23 17:05:27.477447 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:27.477418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-hmlb6_84421eaf-9c05-4479-b702-3cb23a0cc73f/manager/0.log" Apr 23 17:05:27.490630 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:27.490586 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-fhhcq_35a4046a-203e-4b42-87de-5fbb44ea7f4d/kuadrant-console-plugin/0.log" Apr 23 17:05:32.745827 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:32.745794 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8thv9_810bb628-4255-436d-8a3c-1b1c2d54dc0e/global-pull-secret-syncer/0.log" Apr 23 17:05:32.833814 ip-10-0-136-190 kubenswrapper[2571]: I0423 
17:05:32.833783 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-97r2m_d8172c3c-2af9-4787-983c-905f0f36a0b7/konnectivity-agent/0.log" Apr 23 17:05:32.935463 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:32.935415 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-190.ec2.internal_f4da7ff0dd296ec0b2891ebc8475588c/haproxy/0.log" Apr 23 17:05:36.110631 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:36.110580 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-hmlb6_84421eaf-9c05-4479-b702-3cb23a0cc73f/manager/0.log" Apr 23 17:05:36.130447 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:36.130424 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-fhhcq_35a4046a-203e-4b42-87de-5fbb44ea7f4d/kuadrant-console-plugin/0.log" Apr 23 17:05:38.147485 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.147451 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r52q6_3a765611-ea7a-4204-b9f3-15d7d8a32024/node-exporter/0.log" Apr 23 17:05:38.162164 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.162145 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r52q6_3a765611-ea7a-4204-b9f3-15d7d8a32024/kube-rbac-proxy/0.log" Apr 23 17:05:38.179525 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.179509 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r52q6_3a765611-ea7a-4204-b9f3-15d7d8a32024/init-textfile/0.log" Apr 23 17:05:38.293153 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.293084 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a2d0fe11-f062-4bbf-967f-b49737665d70/prometheus/0.log" Apr 23 17:05:38.305177 ip-10-0-136-190 
kubenswrapper[2571]: I0423 17:05:38.305157 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a2d0fe11-f062-4bbf-967f-b49737665d70/config-reloader/0.log" Apr 23 17:05:38.320571 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.320548 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a2d0fe11-f062-4bbf-967f-b49737665d70/thanos-sidecar/0.log" Apr 23 17:05:38.335622 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.335586 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a2d0fe11-f062-4bbf-967f-b49737665d70/kube-rbac-proxy-web/0.log" Apr 23 17:05:38.352328 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.352311 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a2d0fe11-f062-4bbf-967f-b49737665d70/kube-rbac-proxy/0.log" Apr 23 17:05:38.368137 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.368113 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a2d0fe11-f062-4bbf-967f-b49737665d70/kube-rbac-proxy-thanos/0.log" Apr 23 17:05:38.384384 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:38.384366 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a2d0fe11-f062-4bbf-967f-b49737665d70/init-config-reloader/0.log" Apr 23 17:05:41.724203 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.724173 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"] Apr 23 17:05:41.727413 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.727392 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j" Apr 23 17:05:41.730138 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.730118 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2fptc\"/\"default-dockercfg-wnz7f\"" Apr 23 17:05:41.731179 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.731157 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fptc\"/\"kube-root-ca.crt\"" Apr 23 17:05:41.731268 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.731218 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fptc\"/\"openshift-service-ca.crt\"" Apr 23 17:05:41.737877 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.737852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"] Apr 23 17:05:41.745045 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.745021 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-lib-modules\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j" Apr 23 17:05:41.745115 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.745054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/0c730c61-34c0-4e68-aea9-9e9238e00cd7-kube-api-access-rxvm5\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j" Apr 23 17:05:41.745162 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.745135 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-podres\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.745206 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.745190 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-proc\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.745251 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.745215 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-sys\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846263 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/0c730c61-34c0-4e68-aea9-9e9238e00cd7-kube-api-access-rxvm5\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846460 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-podres\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846460 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-proc\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846460 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-sys\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846460 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-lib-modules\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846460 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-proc\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846725 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846476 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-podres\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846725 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846518 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-sys\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.846725 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.846571 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c730c61-34c0-4e68-aea9-9e9238e00cd7-lib-modules\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:41.855850 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:41.855829 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvm5\" (UniqueName: \"kubernetes.io/projected/0c730c61-34c0-4e68-aea9-9e9238e00cd7-kube-api-access-rxvm5\") pod \"perf-node-gather-daemonset-whd7j\" (UID: \"0c730c61-34c0-4e68-aea9-9e9238e00cd7\") " pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:42.038297 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.038211 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:42.156392 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.156359 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"]
Apr 23 17:05:42.159523 ip-10-0-136-190 kubenswrapper[2571]: W0423 17:05:42.159494 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0c730c61_34c0_4e68_aea9_9e9238e00cd7.slice/crio-37645ad189fe2eb2bf280bfc83ef82db14c5e0a64bcb33ed512783c9f0d92353 WatchSource:0}: Error finding container 37645ad189fe2eb2bf280bfc83ef82db14c5e0a64bcb33ed512783c9f0d92353: Status 404 returned error can't find the container with id 37645ad189fe2eb2bf280bfc83ef82db14c5e0a64bcb33ed512783c9f0d92353
Apr 23 17:05:42.161449 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.161433 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:05:42.591809 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.591784 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-79c4v_90a007dd-2ad4-4fff-96a4-2c15593ab169/dns/0.log"
Apr 23 17:05:42.606831 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.606806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-79c4v_90a007dd-2ad4-4fff-96a4-2c15593ab169/kube-rbac-proxy/0.log"
Apr 23 17:05:42.686446 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.686415 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rpmks_ee000696-33d6-4a0e-a2ac-f710ef5e1515/dns-node-resolver/0.log"
Apr 23 17:05:42.817826 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.817793 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j" event={"ID":"0c730c61-34c0-4e68-aea9-9e9238e00cd7","Type":"ContainerStarted","Data":"38cb1ec50ba0597cc921aec5c5eacc8e4933a0916b81c3b4d1fbf8071ac9d18b"}
Apr 23 17:05:42.817826 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.817831 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j" event={"ID":"0c730c61-34c0-4e68-aea9-9e9238e00cd7","Type":"ContainerStarted","Data":"37645ad189fe2eb2bf280bfc83ef82db14c5e0a64bcb33ed512783c9f0d92353"}
Apr 23 17:05:42.818240 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.817974 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:42.833401 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:42.833359 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j" podStartSLOduration=1.833344649 podStartE2EDuration="1.833344649s" podCreationTimestamp="2026-04-23 17:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:05:42.832964261 +0000 UTC m=+1828.711373866" watchObservedRunningTime="2026-04-23 17:05:42.833344649 +0000 UTC m=+1828.711754251"
Apr 23 17:05:43.136210 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:43.136181 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7b9786f944-5tkx5_bab205f1-bda8-4c2a-ba64-83dc4512e8a6/registry/0.log"
Apr 23 17:05:43.194531 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:43.194501 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zrzxd_cea1706c-bca2-4388-9143-d1aa2fbcdc62/node-ca/0.log"
Apr 23 17:05:43.995834 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:43.995800 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-kzcbb_7b826983-703d-4b01-9dc9-3c8dadbc4155/discovery/0.log"
Apr 23 17:05:44.490107 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:44.490076 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bcpj7_e8ef03b2-bf64-4538-9ba2-cd786001399c/serve-healthcheck-canary/0.log"
Apr 23 17:05:44.981273 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:44.981228 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vrfds_95f8311a-523d-4e15-9d53-62b348ee103c/kube-rbac-proxy/0.log"
Apr 23 17:05:44.996657 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:44.996630 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vrfds_95f8311a-523d-4e15-9d53-62b348ee103c/exporter/0.log"
Apr 23 17:05:45.012487 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:45.012461 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vrfds_95f8311a-523d-4e15-9d53-62b348ee103c/extractor/0.log"
Apr 23 17:05:47.588804 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:47.588771 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-b454c4fb-h92mf_a833e29b-84ec-4c94-89dd-fd58326929e7/manager/0.log"
Apr 23 17:05:48.189677 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:48.189638 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5c8bbb8d67-gcckj_b2c16d7b-b156-41a2-bd43-98123e109abe/manager/0.log"
Apr 23 17:05:48.830721 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:48.830686 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2fptc/perf-node-gather-daemonset-whd7j"
Apr 23 17:05:54.159996 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.159966 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9bf7w_e9cf07b9-000a-4b3d-b47a-38c57899b41d/kube-multus/0.log"
Apr 23 17:05:54.178894 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.178869 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z9f6_204aff10-0787-493a-aee6-18d5bf3d74be/kube-multus-additional-cni-plugins/0.log"
Apr 23 17:05:54.193507 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.193482 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z9f6_204aff10-0787-493a-aee6-18d5bf3d74be/egress-router-binary-copy/0.log"
Apr 23 17:05:54.209084 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.209057 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z9f6_204aff10-0787-493a-aee6-18d5bf3d74be/cni-plugins/0.log"
Apr 23 17:05:54.231552 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.231525 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z9f6_204aff10-0787-493a-aee6-18d5bf3d74be/bond-cni-plugin/0.log"
Apr 23 17:05:54.252611 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.252568 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z9f6_204aff10-0787-493a-aee6-18d5bf3d74be/routeoverride-cni/0.log"
Apr 23 17:05:54.267851 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.267807 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z9f6_204aff10-0787-493a-aee6-18d5bf3d74be/whereabouts-cni-bincopy/0.log"
Apr 23 17:05:54.283450 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.283430 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z9f6_204aff10-0787-493a-aee6-18d5bf3d74be/whereabouts-cni/0.log"
Apr 23 17:05:54.684163 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.684133 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7kt9x_dbfa21d6-bf64-400e-9cbe-b26171f7ef99/network-metrics-daemon/0.log"
Apr 23 17:05:54.701970 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:54.701940 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7kt9x_dbfa21d6-bf64-400e-9cbe-b26171f7ef99/kube-rbac-proxy/0.log"
Apr 23 17:05:56.268364 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.268335 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-controller/0.log"
Apr 23 17:05:56.280686 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.280665 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/0.log"
Apr 23 17:05:56.296764 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.296741 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovn-acl-logging/1.log"
Apr 23 17:05:56.314945 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.314921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/kube-rbac-proxy-node/0.log"
Apr 23 17:05:56.332145 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.332121 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 17:05:56.344747 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.344730 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/northd/0.log"
Apr 23 17:05:56.360396 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.360376 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/nbdb/0.log"
Apr 23 17:05:56.375090 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.375070 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/sbdb/0.log"
Apr 23 17:05:56.540814 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:56.540749 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q6qml_9af32d6e-ad97-4dd0-a962-8bc05ab87464/ovnkube-controller/0.log"
Apr 23 17:05:57.559789 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:57.559759 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-nvscm_1e04a51c-81ee-4ef6-b546-0b1e611642d1/network-check-target-container/0.log"
Apr 23 17:05:58.542618 ip-10-0-136-190 kubenswrapper[2571]: I0423 17:05:58.542570 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-p6qdp_adecd1bd-5d92-46f9-97d2-d824dfa55524/iptables-alerter/0.log"